Hi folks,
I ran into the following error today:
Traceback (most recent call last):
  File "testpytorch.py", line 162, in <module>
    loss.backward()
  File "/home/user/miniconda3/envs/torch/lib/python3.6/site-packages/torch/tensor.py", line 102, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph)
  File "/home/user/miniconda3/envs/torch/lib/python3.6/site-packages/torch/autograd/__init__.py", line 90, in backward
    allow_unreachable=True)  # allow_unreachable flag
RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation
Here’s an MWE:
def dummy(x):
    # pairwise squared Euclidean distances: ||x_i||^2 - 2 x_i.x_j + ||x_j||^2
    res = torch.mm(x, x.transpose(dim0=0, dim1=1))
    res = res * -2
    res = res + (x ** 2).sum(dim=1)[:, None]
    res = res + (x ** 2).sum(dim=1)
    res, idxs = torch.max(res, 0)
    res.flatten()[::x.size(0) + 1] = 0.0  # in-place write
    res = torch.sqrt(res)
    return res
If I swap res = torch.sqrt(res) for res = res**(1/2), everything seems to work properly. I must be missing something fundamental here, but I don't know what. Probably something about how autograd works internally?
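In the meantime I'm working around it by doing the zeroing out-of-place: I build the zero mask on a fresh tensor created with torch.ones_like() (which carries no gradient history), so the in-place write never touches a tensor that autograd may have saved. This is just a sketch of my workaround, not an explanation of the underlying behaviour:

```python
import torch

def dummy_no_inplace(x):
    # same computation as dummy(), but without the in-place write on res
    res = torch.mm(x, x.transpose(dim0=0, dim1=1))
    res = res * -2
    res = res + (x ** 2).sum(dim=1)[:, None]
    res = res + (x ** 2).sum(dim=1)
    res, idxs = torch.max(res, 0)
    # build the mask on a fresh tensor; ones_like() requires no grad,
    # so modifying it in-place is safe for autograd
    mask = torch.ones_like(res)
    mask.flatten()[::x.size(0) + 1] = 0.0
    res = res * mask          # out-of-place replacement for the slice assignment
    res = torch.sqrt(res)
    return res

x = torch.randn(5, 3, requires_grad=True)
loss = dummy_no_inplace(x).sum()
loss.backward()
print(x.grad.shape)
```

This backpropagates without the RuntimeError on my machine, but I'd still like to understand why sqrt and **(1/2) behave differently in the original version.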
Any insights would be highly appreciated!