I’m receiving the error “one of the variables needed for gradient computation has been modified by an inplace operation” when I run the code below. To avoid the error, I have to either set `iterations` to 1 or write
`x.data = x.data + z[0,:,:]` instead of
`x = x + z[0,:,:]`. Is there a way to overcome this without resorting to either of these workarounds? I ran this code with PyTorch 0.4.0 on Windows 10. On Red Hat, curiously, the code hangs instead. @tom - any ideas, please? Thank you.
```python
import torch

iterations = 2
m, q, p, n = 2, 3, 4, 3
x = torch.randn(m, q)
y = torch.randn(p, n).requires_grad_(requires_grad=True)
for i in range(iterations):
    z = torch.einsum('mq,pn->pmn', (x, y))
    # x.data = x.data + z[0,:,:]  # WORKS FINE
    x = x + z[0,:,:]  # THROWS ERROR
    err = torch.sum(z)
    err.backward()
print(y.grad)
```
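For reference, one alternative I considered (just a sketch, and I am not sure it preserves the semantics I want, since it stops gradients from flowing back through earlier iterations via `x`) is to wrap the update in a `torch.no_grad()` block, so the new `x` carries no graph history, similar in effect to the `.data` workaround:

```python
import torch

iterations = 2
m, q, p, n = 2, 3, 4, 3
x = torch.randn(m, q)
y = torch.randn(p, n).requires_grad_(True)
for i in range(iterations):
    z = torch.einsum('mq,pn->pmn', x, y)
    with torch.no_grad():
        # the new x has requires_grad=False and no grad_fn,
        # so the next iteration's z only tracks gradients w.r.t. y
        x = x + z[0, :, :]
    err = torch.sum(z)
    err.backward()
print(y.grad)
```

This runs without the in-place error and without touching `.data` directly, but I would like to know whether it is the recommended approach.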