Is there a way to compute the second or third gradient in PyTorch?
Can I do `(loss.backward()).backward()`?
Yes, if you pass the right arguments to these functions: see the `create_graph`
argument in the doc. Note that `(loss.backward()).backward()` won't work, because `.backward()` returns `None`.
Also remember that `.backward()`
accumulates gradients into the parameters' `.grad` attributes (it does not return them), so you will need to read them from your model.
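As a minimal sketch of the accumulation behavior (the small `nn.Linear` model here is just an illustration):

```python
import torch
import torch.nn as nn

model = nn.Linear(3, 1)
x = torch.randn(5, 3)
loss = model(x).pow(2).mean()

# create_graph=True keeps the graph of the backward pass,
# which is what allows higher-order derivatives later.
result = loss.backward(create_graph=True)

# .backward() returns None; gradients land in each parameter's .grad.
print(result)  # None
for p in model.parameters():
    print(p.grad.shape)
```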
You can use `autograd.grad`
if you want gradients with respect to specific inputs and have them returned to you directly.
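For example, repeated calls to `torch.autograd.grad` with `create_graph=True` give second and third derivatives of a scalar function:

```python
import torch

x = torch.tensor(2.0, requires_grad=True)
y = x ** 3  # y = x^3

# First derivative: dy/dx = 3x^2 = 12 at x = 2.
# create_graph=True makes the result differentiable again.
(g1,) = torch.autograd.grad(y, x, create_graph=True)

# Second derivative: d2y/dx2 = 6x = 12 at x = 2.
(g2,) = torch.autograd.grad(g1, x, create_graph=True)

# Third derivative: d3y/dx3 = 6.
(g3,) = torch.autograd.grad(g2, x)

print(g1.item(), g2.item(), g3.item())  # 12.0 12.0 6.0
```

Unlike `.backward()`, each call returns the gradients as a tuple instead of writing them into `.grad`.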