How to define Hessian of a custom torch.autograd.Function

My model uses custom torch.autograd.Functions, and I would like to define and compute its Hessian.
Since only the first derivative is defined (in the backward method), how can I define the second derivative?
I imagine it could be something like a "second_backward" method, but I haven't found anything like that.

Thank you.

Hi Felix!

Does torch.autograd.functional.hessian() fit your use case?
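For a scalar-valued function, it computes the full Hessian for you, relying on autograd being able to differentiate your backward twice. A minimal sketch with a toy function (stand-in for your actual model):

```python
import torch

# Toy scalar function; the Hessian of sum(x**3) is diagonal
# with entries 6 * x_i.
def f(x):
    return (x ** 3).sum()

x = torch.tensor([1.0, 2.0])
H = torch.autograd.functional.hessian(f, x)
print(H)  # tensor([[ 6., 0.], [ 0., 12.]])
```

Note that this requires every operation in `f` (including any custom Functions) to support double backward; otherwise you will get an error or incorrect zeros.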


K. Frank

If you want to define a double backward directly within torch.autograd.Function, you can embed another torch.autograd.Function inside the backward of your current torch.autograd.Function. An example is here
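To illustrate the pattern, here is a minimal sketch for a square function (the class names are just illustrative): the outer Function's backward delegates to a second Function, whose own backward then supplies the second derivative.

```python
import torch

class Square(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return x ** 2

    @staticmethod
    def backward(ctx, grad_out):
        x, = ctx.saved_tensors
        # Delegate to a second custom Function so that the
        # backward pass is itself differentiable.
        return SquareBackward.apply(x, grad_out)

class SquareBackward(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, grad_out):
        ctx.save_for_backward(x, grad_out)
        # First derivative: d(x**2)/dx * grad_out
        return 2.0 * x * grad_out

    @staticmethod
    def backward(ctx, gg):
        x, grad_out = ctx.saved_tensors
        # Gradients of (2 * x * grad_out) w.r.t. x and grad_out;
        # the first one carries the second derivative (here: 2).
        return 2.0 * grad_out * gg, 2.0 * x * gg

x = torch.tensor(3.0, requires_grad=True)
y = Square.apply(x)
g, = torch.autograd.grad(y, x, create_graph=True)
h, = torch.autograd.grad(g, x)
print(g.item(), h.item())  # 6.0 2.0
```

The key points are `create_graph=True` on the first `grad` call and returning one gradient per input from `SquareBackward.backward`.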