Hi, I have a question about how to obtain higher-order derivatives of a custom torch.autograd.Function. If the forward and backward of the function are implemented with scipy rather than PyTorch, the first-order derivative can still be obtained through the backward defined in the Function. But what about higher-order derivatives?
You won’t be able to get higher-order gradients for free if your backward isn’t written in a differentiable way using torch ops. Instead, you’ll need to implement the backward as another custom Function.
See the Double Backward with Custom Functions tutorial in the PyTorch documentation.
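
Here’s a minimal sketch of that pattern, assuming the non-torch computation is a sigmoid via scipy.special.expit (the class names here are illustrative, not from your code). The first Function’s backward delegates to a second Function that computes the analytic first derivative, so autograd can differentiate through it one more time:

```python
import torch
from scipy.special import expit  # sigmoid computed outside of PyTorch

class ScipySigmoid(torch.autograd.Function):
    """Forward computed with scipy; backward delegates to a second
    custom Function so the gradient itself is differentiable."""

    @staticmethod
    def forward(ctx, x):
        y = torch.from_numpy(expit(x.detach().cpu().numpy())).to(x)
        ctx.save_for_backward(x)
        return y

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        # Calling another Function here (instead of raw scipy math) is
        # what lets autograd record a graph for the backward pass.
        return ScipySigmoidBackward.apply(x, grad_out)

class ScipySigmoidBackward(torch.autograd.Function):
    """Computes grad_out * sigmoid'(x); its own backward supplies the
    second derivative, enabling one extra level of differentiation."""

    @staticmethod
    def forward(ctx, x, grad_out):
        y = torch.from_numpy(expit(x.detach().cpu().numpy())).to(x)
        ctx.save_for_backward(x, grad_out)
        return grad_out * y * (1 - y)

    @staticmethod
    def backward(ctx, grad_grad):
        x, grad_out = ctx.saved_tensors
        y = torch.from_numpy(expit(x.detach().cpu().numpy())).to(x)
        dydx = y * (1 - y)            # sigmoid'(x)
        d2ydx2 = dydx * (1 - 2 * y)   # sigmoid''(x)
        # Gradients w.r.t. the two inputs (x, grad_out) of forward().
        return grad_grad * grad_out * d2ydx2, grad_grad * dydx
```

With this you can take a second derivative the usual way:

```python
x = torch.randn(4, dtype=torch.double, requires_grad=True)
y = ScipySigmoid.apply(x).sum()
(g,) = torch.autograd.grad(y, x, create_graph=True)  # first order
(h,) = torch.autograd.grad(g.sum(), x)               # second order
```

Note that because ScipySigmoidBackward’s own backward also uses scipy, third-order gradients would require yet another level of the same pattern.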