"UserWarning: The given buffer is not writable" when storing additional tensors in ExportedProgram.state_dict

I am getting a warning that I would like to get rid of. An MWE being worth a thousand words, here it is:

import torch

class TestNet(torch.nn.Module):
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x

ep = torch.export.export(TestNet(), (torch.zeros([10, 10]),))
ep.state_dict["whatever"] = torch.zeros([5, 5])  # state_dict abuse!
torch.export.save(ep, "/tmp/test.pt2")
ep2 = torch.export.load("/tmp/test.pt2")  # raises that UserWarning

Running this triggers:

/.venv/lib/python3.13/site-packages/torch/export/pt2_archive/_package.py:682: UserWarning: The given buffer is not writable, and PyTorch does not support non-writable tensors. This means you can write to the underlying (supposedly non-writable) buffer using the tensor. You may want to copy the buffer to protect its data or make it writable before converting it to a tensor. This type of warning will be suppressed for the rest of this program. (Triggered internally at /pytorch/torch/csrc/utils/tensor_new.cpp:1581.)
tensor = torch.frombuffer(
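
If it helps, my (possibly wrong) understanding is that `torch.frombuffer` warns here because the bytes read back from the `.pt2` archive expose a read-only buffer. A stdlib-only illustration of the distinction, with no torch involved:

```python
# bytes exposes a read-only buffer, bytearray a writable one; this
# readonly flag is the property torch.frombuffer() complains about.
ro = memoryview(bytes(8))      # read-only buffer
rw = memoryview(bytearray(8))  # writable buffer
print(ro.readonly, rw.readonly)  # True False
```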

My questions are:

  • Should I not care about this warning?
  • How bad an idea is it to abuse state_dict to store additional metadata as I am doing?
  • Is there an operation I should call on this ep.state_dict["whatever"] tensor to ensure that it does not trigger that warning when loading the model later?

Just to be clear, loading works as intended, including the “whatever” tensor. I am concerned about this warning specifically.
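
For completeness, I know I can silence it explicitly around the load call. This is a stdlib-only sketch, and the `message` filter assumes the warning text stays stable across PyTorch versions:

```python
import warnings

# Hypothetical workaround (not necessarily the right fix): ignore just
# this UserWarning. The `message` argument is a regex matched against
# the start of the warning text.
with warnings.catch_warnings():
    warnings.filterwarnings(
        "ignore",
        message="The given buffer is not writable",
        category=UserWarning,
    )
    # ep2 = torch.export.load("/tmp/test.pt2") would go here; a plain
    # warnings.warn() stands in so this sketch runs without torch.
    warnings.warn("The given buffer is not writable, ...", UserWarning)
```

But I would rather understand whether the warning points at a real problem than just hide it.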