How can we compute the gradient of the predicted label with respect to the input image?

Hi all,

We have a CNN-based model that classifies an image into one of two categories. How can we compute the gradient of the predicted label with respect to the input image?
Just like the tensorflow implementation: https://github.com/ankurtaly/Integrated-Gradients/blob/dad3d7f2a38c8feb754378c94bd5bfdd81300552/attributions.ipynb

Thanks.

Set `X.requires_grad = True` for your input image `X`, then run `X` through your network. Calling `.backward()` on a scalar computed from the network output (e.g. the logit of the predicted class) computes the gradient of that scalar with respect to `X` and stores it in `X.grad`.
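A minimal sketch of these steps, using a small made-up CNN as a stand-in for your two-class model (the architecture and input size here are placeholders, not from the original post):

```python
import torch
import torch.nn as nn

# Hypothetical two-class CNN standing in for the model described above.
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(8, 2),
)
model.eval()

# Input image with gradient tracking enabled.
x = torch.rand(1, 3, 32, 32, requires_grad=True)

logits = model(x)                        # shape (1, 2)
label = logits.argmax(dim=1).item()      # predicted class index
score = logits[0, label]                 # scalar logit of the predicted label

score.backward()                         # populates x.grad

grad = x.grad                            # same shape as the input image
print(grad.shape)                        # torch.Size([1, 3, 32, 32])
```

For Integrated Gradients specifically (as in the linked notebook), you would repeat this gradient computation at several points interpolated between a baseline image and `x`, then average the gradients.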