Use torch.autograd.grad instead of calling backward() on the loss

torch.autograd.grad(loss, y) computes d(loss)/dy and returns it directly as a tuple, without accumulating anything into the .grad attributes of the leaf tensors; that is why x.grad is still None afterwards.
>>> x = torch.autograd.Variable(torch.FloatTensor([3.0]), requires_grad=True)
>>> y = 4*x
>>> loss = (y - 10)**2
>>> torch.autograd.grad(loss, y)
(Variable containing:
4
[torch.FloatTensor of size 1]
,)
>>> x.grad is None
True
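
The transcript above uses the old Variable wrapper. On PyTorch 0.4 and later, where Variable has been merged into Tensor, an equivalent session might look like the following sketch (retain_graph=True is needed only because the graph is reused for a second grad call):

>>> x = torch.tensor([3.0], requires_grad=True)
>>> y = 4 * x
>>> loss = (y - 10) ** 2
>>> torch.autograd.grad(loss, y, retain_graph=True)   # d(loss)/dy = 2*(y-10) = 4
(tensor([4.]),)
>>> torch.autograd.grad(loss, x)                       # d(loss)/dx = 2*(4x-10)*4 = 16
(tensor([16.]),)
>>> x.grad is None                                     # nothing was accumulated into .grad
True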