Merged Tensor and Variable example gradient not working

In April, Tensors and Variables were merged. My understanding from the example code is that the following should work:

import torch

x = torch.ones(1, requires_grad=True)
y = torch.ones(1, requires_grad=True)
z = x + y
z.backward()
print(z.grad)

Yet z.grad is None. What gives?

z is an intermediate (non-leaf) tensor, whose .grad is not retained by default (x and y, as leaves, do get their grads).
If you want the grad for z as well, call z.retain_grad() after z = x + y.
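Putting that together, a minimal sketch of the fix applied to the snippet above:

```python
import torch

x = torch.ones(1, requires_grad=True)
y = torch.ones(1, requires_grad=True)
z = x + y
z.retain_grad()  # ask autograd to keep the gradient for this non-leaf tensor
z.backward()

print(z.grad)  # tensor([1.])  -- dz/dz = 1
print(x.grad)  # tensor([1.])  -- dz/dx = 1, retained because x is a leaf
```

Without the retain_grad() call, z.grad would be None, since autograd frees intermediate gradients to save memory.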

Best regards

Thomas

Ah, thank you! That clarifies things a lot.

If any developer comes across this: the PyTorch 0.4.0 migration guide's section on the Tensor-Variable merge does not mention retain_grad.