The tutorial (http://pytorch.org/tutorials/beginner/former_torchies/autograd_tutorial.html#gradients) says: "if you ever want to do the backward on some part of the graph twice, you need to pass in retain_variables = True during the first pass."
The example it gives is:
import torch
from torch.autograd import Variable
x = Variable(torch.ones(2, 2), requires_grad=True)
y = x + 2
y.backward(torch.ones(2, 2), retain_graph=True)
z = y * y
gradient = torch.randn(2, 2)
But when I run this code with retain_graph=True and with retain_graph=False, both versions work with no error, and the gradients are correct.
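In case it matters, here is the full script I used to compare the two settings. The final z.backward(gradient) line is my own guess at the step the example seems to be building towards, since the quoted snippet never actually uses gradient or calls backward a second time:

import torch
from torch.autograd import Variable

x = Variable(torch.ones(2, 2), requires_grad=True)
y = x + 2
# first backward pass; I toggled retain_graph between True and False here
y.backward(torch.ones(2, 2), retain_graph=True)

z = y * y
gradient = torch.randn(2, 2)
# second backward pass (my addition, not in the tutorial snippet)
z.backward(gradient)

# x.grad now holds the accumulated contributions from both backward calls
print(x.grad)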
Is anything wrong with the example?