Why is x.grad = 70?

import torch

x = torch.tensor(5., requires_grad=True)
y = 2 * x * x
x.data *= 6
y.backward()
print(x.grad)  # tensor(70.)

In-place manipulation of the .data attribute is not tracked by Autograd, so it can produce wrong gradients (as in this case) and other unexpected side effects. Roughly what happens here: the multiplications save x for the backward pass, and after x.data *= 6 that saved value is 30 instead of 5, so backward computes 30 * 2 + 10 = 70 instead of the correct 4 * x = 20.
Remove the x.data manipulation and it should work. :wink:
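
For reference, a minimal corrected sketch (same values, just without the .data line) should print the expected gradient of 4 * x = 20:

import torch

x = torch.tensor(5., requires_grad=True)
y = 2 * x * x   # y = 2x^2
y.backward()    # dy/dx = 4x
print(x.grad)   # tensor(20.)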
