Custom backward as a Jacobian vector product

To implement a custom PyTorch Function, we need to implement both a forward and a backward function. However, if the input is a tensor and the output is a tensor, it looks as though the backward function has to return the Jacobian explicitly. The autograd engine itself, on the other hand, never forms the Jacobian explicitly; it uses Jacobian-vector products instead.

In my use case, the Jacobian of my custom Function is a very large tensor. Is there any way to implement backward as a Jacobian-vector product instead? If not, are there plans to make this possible in the future?
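
For concreteness, here is a toy illustration of the size problem (the elementwise cube is just a stand-in for my actual Function): for `f(x) = x ** 3` applied elementwise to a length-n vector, the Jacobian is an n x n diagonal matrix, yet the backward pass only ever needs its product with a vector.

```python
import torch

n = 1_000
x = torch.randn(n)
v = torch.randn(n)  # plays the role of grad_output

# Explicit Jacobian of the elementwise cube: n x n, almost all zeros.
J = torch.diag(3 * x ** 2)
vjp_explicit = v @ J            # O(n^2) memory and time

# The same product computed directly, without forming J: O(n).
vjp_direct = v * 3 * x ** 2

assert torch.allclose(vjp_explicit, vjp_direct)
```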

Have you tried invoking the backward function with the feature vector, i.e. `backward(x)` instead of `backward()`?
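
Something like this, a minimal sketch with a toy non-scalar output, where `v` stands in for the feature vector:

```python
import torch

x = torch.randn(5, requires_grad=True)
y = x ** 3                # non-scalar output
v = torch.ones_like(y)    # the vector seeding the product

# backward(v) accumulates v^T J into x.grad; the Jacobian itself
# is never materialized by the autograd engine.
y.backward(v)
print(x.grad)             # equals v * 3 * x**2
```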

I wrote a custom class that inherits from `autograd.Function`, with backward defined as `def backward(ctx, grad_output):`. It returns the product of `grad_output` with the Jacobian (a vector-Jacobian product, never the full matrix), which has the same shape as the input.
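
A minimal sketch of that pattern (`CubeFn` and the cube math are made-up stand-ins for illustration):

```python
import torch

class CubeFn(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return x ** 3

    @staticmethod
    def backward(ctx, grad_output):
        # grad_output has the shape of the output; we return the
        # vector-Jacobian product, which has the shape of the input.
        # The n x n Jacobian is never built.
        (x,) = ctx.saved_tensors
        return grad_output * 3 * x ** 2

x = torch.randn(4, requires_grad=True)
y = CubeFn.apply(x)
y.sum().backward()
print(x.grad)  # 3 * x**2

# Sanity check against autograd's numerical Jacobian (needs double precision):
x64 = torch.randn(4, dtype=torch.double, requires_grad=True)
assert torch.autograd.gradcheck(CubeFn.apply, (x64,))
```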