My model uses custom torch.autograd.Functions, and I would like to define and compute its Hessian.
Since only the first derivative is defined (in the backward method), how can I define the second derivative?
I imagine it could be something like a “second_backward” method, but I haven’t found anything like that.
If you want to define a double backward directly within a torch.autograd.Function, you can embed another torch.autograd.Function inside the backward of your current torch.autograd.Function. An example is here
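To illustrate the pattern, here is a minimal sketch for a toy function f(x) = x³: the outer Function's backward delegates to a second Function, so the first-derivative computation is itself differentiable and `torch.autograd.grad` with `create_graph=True` can take a second derivative through it. The names `Cube` and `CubeBackward` are my own for this example.

```python
import torch

class Cube(torch.autograd.Function):
    # Toy example: f(x) = x**3
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return x ** 3

    @staticmethod
    def backward(ctx, grad_out):
        x, = ctx.saved_tensors
        # Delegate to a second Function so this backward is itself
        # differentiable (this is the "embedded" Function)
        return CubeBackward.apply(grad_out, x)

class CubeBackward(torch.autograd.Function):
    # Computes the first derivative, grad_out * 3x^2, and defines
    # its own backward, which supplies the second derivative
    @staticmethod
    def forward(ctx, grad_out, x):
        ctx.save_for_backward(grad_out, x)
        return grad_out * 3 * x ** 2

    @staticmethod
    def backward(ctx, gg):
        grad_out, x = ctx.saved_tensors
        # Gradients w.r.t. grad_out and x:
        # d/d(grad_out) = 3x^2,  d/dx = 6x * grad_out
        return gg * 3 * x ** 2, gg * 6 * x * grad_out

x = torch.tensor(2.0, requires_grad=True)
y = Cube.apply(x)
# create_graph=True builds the graph of the backward pass,
# which records CubeBackward and makes g differentiable
g, = torch.autograd.grad(y, x, create_graph=True)  # 3x^2 = 12
h, = torch.autograd.grad(g, x)                     # 6x  = 12
```

For a full Hessian of a vector-valued model you would repeat this per output, or pass `Cube.apply` (or your model) to `torch.autograd.functional.hessian`, which performs the double backward for you once the custom Function supports it.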