How to Penalize Norm of End-to-End Jacobian

Hi,

You should never need .data anymore. Also keep in mind that .data breaks the computational graph, so no gradients will flow back through it.

Here you want to compute the gradient in such a way that you can backpropagate through the gradient computation itself. This is done with the create_graph=True flag to autograd.grad.
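Here is a minimal sketch of what that looks like (the model, input, and penalty weight are hypothetical placeholders, not from your code):

```python
import torch

model = torch.nn.Linear(4, 3)
x = torch.randn(8, 4, requires_grad=True)

out = model(x)

# Vector-Jacobian product with a vector of ones.
# create_graph=True records the gradient computation itself,
# so the penalty below is differentiable.
grad_x, = torch.autograd.grad(
    outputs=out,
    inputs=x,
    grad_outputs=torch.ones_like(out),
    create_graph=True,
)

task_loss = out.pow(2).mean()   # stand-in for your real loss
penalty = grad_x.pow(2).sum()   # squared norm of the vJp
loss = task_loss + 0.1 * penalty
loss.backward()                 # gradients flow through the penalty too
```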

Also, since you give a Tensor of ones as grad_outputs, you get the sums of the columns of your Jacobian, not the Jacobian itself, because autograd only computes a vector-Jacobian product. To recover the full Jacobian, you need one such product per output dimension.
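For example, a sketch that builds the full Jacobian of a single sample row by row, using one-hot vectors instead of ones (again with placeholder model and shapes):

```python
import torch

model = torch.nn.Linear(4, 3)
x = torch.randn(1, 4, requires_grad=True)
out = model(x)

rows = []
for i in range(out.shape[1]):
    v = torch.zeros_like(out)
    v[0, i] = 1.0  # select the i-th output
    # One vJp per output dimension; create_graph=True keeps
    # the result differentiable for the penalty.
    row, = torch.autograd.grad(out, x, grad_outputs=v, create_graph=True)
    rows.append(row)

jacobian = torch.cat(rows, dim=0)  # shape: (out_dim, in_dim)
penalty = jacobian.pow(2).sum()    # squared Frobenius norm
```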
