Using the gradients in a loss function

To make it work without much change, add create_graph=True, retain_graph=True to the backward call.
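
Roughly like this (a sketch, assuming inp requires grad and y_pred is the model output; the names are placeholders, not the original code):

# create_graph=True records the backward pass itself, so inp.grad becomes
# differentiable and can appear in the loss; retain_graph=True keeps the
# graph alive for the later loss.backward() call
y_pred.sum().backward(create_graph=True, retain_graph=True)
gradspred = inp.grad  # d y_pred / d inp, still attached to the graph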

However, a more efficient way is to use autograd.grad directly:

import torch
from torch import autograd

# gradient of y_pred w.r.t. inp, with create_graph=True so it can be used in a differentiable loss
gradspred, = autograd.grad(y_pred, inp,
                           grad_outputs=torch.ones_like(y_pred),
                           create_graph=True)
loss = ...
loss.backward()
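
For context, the elided loss can be anything that consumes gradspred. A toy example (the penalty form and the target y are purely illustrative, not from this thread):

data_loss = ((y_pred - y) ** 2).mean()   # ordinary prediction loss
grad_penalty = (gradspred ** 2).mean()   # penalise large input gradients
loss = data_loss + 0.1 * grad_penalty
loss.backward()

Unlike the backward-based version, autograd.grad does not accumulate anything into .grad fields, so there is nothing extra to zero out between iterations.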