Long story short, if your buffer is registered as non-persistent, e.g. self.register_buffer("buf", torch.randn(4, 4), persistent=False), ExecuTorch treats the buffer as "write before read", meaning its initial values are not important to ExecuTorch. This pattern is common for the KV cache in LLM models, where you write the KV values in one iteration and read them back in the next.
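Here is a minimal sketch of that pattern. The KVCache module, its dimensions, and the forward signature are illustrative assumptions, not code from ExecuTorch; the point is only that the non-persistent buffers are written before they are read, so their initial contents never matter.

```python
import torch
import torch.nn as nn


class KVCache(nn.Module):
    """Toy KV cache illustrating non-persistent, write-before-read buffers."""

    def __init__(self, max_seq_len: int = 16, head_dim: int = 4):
        super().__init__()
        # persistent=False: the buffers are not part of the state_dict, and
        # ExecuTorch treats them as write-before-read, so the zeros here are
        # just placeholders and are never relied upon.
        self.register_buffer(
            "k_cache", torch.zeros(max_seq_len, head_dim), persistent=False
        )
        self.register_buffer(
            "v_cache", torch.zeros(max_seq_len, head_dim), persistent=False
        )

    def forward(self, pos: torch.Tensor, k: torch.Tensor, v: torch.Tensor):
        # Write the new KV values for this position first...
        self.k_cache[pos] = k
        self.v_cache[pos] = v
        # ...then read the cache, e.g. for attention over all cached positions.
        return self.k_cache, self.v_cache
```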