To implement a custom PyTorch `Function`, we need to implement a `forward` and a `backward` method. However, when the input and the output are both tensors, as far as I can tell the `backward` ends up forming the Jacobian explicitly. The autograd engine, on the other hand, never forms the Jacobian explicitly; it propagates vector-Jacobian products instead.
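For concreteness, here is a minimal sketch of the pattern I mean, using a toy element-wise square on a 1-D input (the class name `NaiveSquare` is just for illustration, not my actual code):

```python
import torch

class NaiveSquare(torch.autograd.Function):
    """Toy element-wise square whose backward materializes the Jacobian."""

    @staticmethod
    def forward(ctx, x):
        # Assumes a 1-D input tensor for simplicity.
        ctx.save_for_backward(x)
        return x ** 2

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Explicit n x n Jacobian of y = x**2 -- diagonal for an
        # element-wise op, but its size grows quadratically with n.
        jac = torch.diag(2 * x)
        # grad_input = J^T @ grad_output (J happens to be symmetric here).
        return jac.t() @ grad_output
```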
In my use case, the Jacobian of my custom `Function` is a very large tensor. Is there any way to implement `backward` as a vector-Jacobian product instead, so that the Jacobian is never materialized? If not, are there plans to make this possible in the future?
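In other words, for the toy function above, what I would like to be able to write is something along these lines (a hypothetical sketch; `VJPSquare` is again a made-up name):

```python
class VJPSquare(torch.autograd.Function):
    """Same toy function, but backward computes the product directly."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return x ** 2

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # The Jacobian is diagonal, so the vector-Jacobian product
        # collapses to an element-wise multiply; no n x n tensor is built.
        return 2 * x * grad_output
```

On small double-precision inputs both versions should agree under `torch.autograd.gradcheck`; the difference is only that the first pays the O(n^2) memory cost of the explicit Jacobian. Is this second pattern supported for general custom `Function`s?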