The error message is there to alert you to a backwards-incompatible change; our apologies that the message isn't clearer, we will fix that.
Essentially, the issue is that previously self.data would give you the same underlying Tensor as self (this is legacy behavior from when Variables "wrapped" Tensors), so metadata changes (e.g. size, storage, etc.) made to .data would also change self. Now self.data gives you a (non-differentiable) alias with its own copy of the metadata, so metadata changes made to self.data will not be reflected in self.
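To make the new semantics concrete, here is a small illustrative session: the alias returned by .data still shares storage with self, so in-place value changes flow through, but it carries its own copy of the metadata and drops the autograd flag.
>>> import torch
>>> t = torch.ones(3, requires_grad=True)
>>> alias = t.data
>>> alias.requires_grad
False
>>> alias.add_(1)   # in-place *value* change: storage is shared
tensor([2., 2., 2.])
>>> t               # ...so it shows up in t as well
tensor([2., 2., 2.], requires_grad=True)
>>> alias.set_(torch.randn(7))   # *metadata* change: now rejected
RuntimeError: set_storage is not allowed on Tensor created from .data or .detach()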
The fix depends on what you were trying to achieve. If you were trying to change self without autograd knowing about it, you should put the change in a torch.no_grad() block and stop referencing .data.
Here’s a simplified example.
Before:
>>> x=torch.randn((2,3,4), requires_grad=True)
>>> y=x.sum(dim=2)
>>> z = torch.randn(7)
>>> y.data.set_(z)
>>> y
tensor([-2.8514,  0.5746, -1.3324,  1.1557,  0.7676,  1.8141,  0.3276],
       grad_fn=<SumBackward2>)
Now (error):
>>> x=torch.randn((2,3,4), requires_grad=True)
>>> y=x.sum(dim=2)
>>> z = torch.randn(7)
>>> y.data.set_(z)
RuntimeError: set_storage is not allowed on Tensor created from .data or .detach()
Corrected:
>>> x=torch.randn((2,3,4), requires_grad=True)
>>> y=x.sum(dim=2)
>>> z = torch.randn(7)
>>> with torch.no_grad():
...     y.set_(z)
...
>>> y
tensor([-2.8514,  0.5746, -1.3324,  1.1557,  0.7676,  1.8141,  0.3276],
       grad_fn=<SumBackward2>)
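For context, the most common place the old .data idiom appeared was manual parameter updates. Here is a minimal sketch of the no_grad() replacement (hypothetical variable names, not from the original question):
>>> import torch
>>> w = torch.randn(3, requires_grad=True)
>>> loss = (w * w).sum()
>>> loss.backward()
>>> with torch.no_grad():
...     w -= 0.1 * w.grad   # update is invisible to autograd (was: w.data -= 0.1 * w.grad.data)
...
>>> w.grad = None           # reset gradients (instead of w.grad.data.zero_())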