gspr
July 9, 2018, 7:00pm
#1
In April, Tensors and Variables were merged. It is my understanding from the example code that the following should work:
x = torch.ones(1, requires_grad=True)
y = torch.ones(1, requires_grad=True)
z = x + y
z.backward()
print(z.grad)
Yet z.grad is None. What gives?
tom
(Thomas V)
July 9, 2018, 9:32pm
#2
z is an intermediate result, whose .grad is not retained by default (x and y, being leaf tensors, do get grads).
If you want the grad for z as well, call z.retain_grad() after z = x + y.
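For instance, a minimal sketch of the fix applied to the snippet above (retain_grad() asks autograd to store the gradient of a non-leaf tensor during the backward pass):

import torch

x = torch.ones(1, requires_grad=True)
y = torch.ones(1, requires_grad=True)
z = x + y
z.retain_grad()  # keep the grad of this non-leaf tensor
z.backward()
print(z.grad)          # tensor([1.]) -- no longer None
print(x.grad, y.grad)  # leaf grads work as before: tensor([1.]) tensor([1.])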
Best regards
Thomas
gspr
July 10, 2018, 8:22am
#3
Ah, thank you! That clarifies things a lot.
If any developer comes across this: the PyTorch 0.4.0 migration guide's section on the Tensor-Variable merge does not mention retain_grad.