Error when trying to calculate gradient

When I try the following

import torch
from torch.autograd import Variable

x = Variable(torch.Tensor([1]), requires_grad=True)
w = Variable(torch.Tensor([2]), requires_grad=True)
b = Variable(torch.Tensor([3]), requires_grad=True)

y = w * x + b

y.backward()    # calculates gradients
print(w.grad)   # dy/dw = x  --> 1

it works. But when I build one more variable and call backward() again, like this,

z = 2 * y
z.backward()

I get the following error

RuntimeError                              Traceback (most recent call last)
<ipython-input-49-c111445625f1> in <module>()
      1 z = 2 * y
----> 3 z.backward()

/home/paarulakan/environments/python/pytorch-py35/lib/python3.5/site-packages/torch/autograd/ in backward(self, gradient, retain_variables)
    144                     'or with gradient w.r.t. the variable')
    145             gradient =
--> 146         self._execution_engine.run_backward((self,), (gradient,), retain_variables)
    148     def register_hook(self, hook):

/home/paarulakan/environments/python/pytorch-py35/lib/python3.5/site-packages/torch/autograd/_functions/ in backward(self, grad_output)
     47     def backward(self, grad_output):
---> 48         a, b = self.saved_tensors
     49         return grad_output.mul(b), maybe_view(grad_output.mul(a), self.b_size)

RuntimeError: Trying to backward through the graph second time, but the buffers have already been freed. Please specify retain_variables=True when calling backward for the first time.

What is wrong here?

By default, autograd frees the intermediate buffers of the graph as soon as backward() finishes, so a second backward pass through the same graph fails. Just add retain_variables=True to the first backward() call when you intend to backpropagate through the same graph more than once.

y.backward(retain_variables=True)   #calculates gradients
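For reference, in PyTorch 0.4 and later the argument was renamed to retain_graph=True, and Variable was merged into Tensor, so the wrapper is no longer needed. A minimal sketch of the whole corrected flow with the newer API (note that gradients accumulate across the two backward calls):

```python
import torch

# Leaf tensors that track gradients (no Variable wrapper needed in PyTorch >= 0.4)
x = torch.tensor([1.0], requires_grad=True)
w = torch.tensor([2.0], requires_grad=True)
b = torch.tensor([3.0], requires_grad=True)

y = w * x + b

# Keep the graph's buffers so we can backprop through y again later
y.backward(retain_graph=True)
print(w.grad)   # dy/dw = x = 1

z = 2 * y
z.backward()    # works now, because the graph was retained
print(w.grad)   # accumulated: dy/dw + dz/dw = 1 + 2*x = 3
```

If you do not want the accumulation, zero the gradient (e.g. w.grad.zero_()) between the two backward calls.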