Gradient through image processor

In my code I am computing the gradient with respect to the image given as input to a ResNet. I use AutoImageProcessor from the transformers library. Is it the case that the gradient will not flow through AutoImageProcessor? For example, in the code below:

from transformers import AutoImageProcessor

image_processor = AutoImageProcessor.from_pretrained("microsoft/resnet-18")

model_input = image_processor(image, return_tensors="pt")

outputs = resnet(**model_input, output_hidden_states=True)

When I backpropagate the loss from the model, I obtain the gradient at model_input (using model_input.grad); however, image.grad gives me an error: "The .grad attribute of a Tensor that is not a leaf Tensor is being accessed. Its .grad attribute won't be populated during autograd.backward(). If you indeed want the .grad field to be populated for a non-leaf Tensor, use .retain_grad() on the non-leaf Tensor."

model_input is an intermediate activation and thus not a leaf tensor, as the warning describes. Did you try calling .retain_grad() on it, as suggested?
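For reference, here is a minimal standalone sketch of the leaf vs. non-leaf .grad behaviour in plain torch (no transformers involved; tensor names are just for illustration):

```python
import torch

# A leaf tensor: created directly by the user with requires_grad=True.
x = torch.randn(3, requires_grad=True)

# A non-leaf (intermediate) tensor: the result of an operation on x.
y = x * 2

# Without this call, y.grad would stay None after backward(),
# and accessing it would trigger the warning from the error message.
y.retain_grad()

loss = y.sum()
loss.backward()

print(x.grad)  # populated: x is a leaf tensor
print(y.grad)  # populated only because retain_grad() was called
```

Note that this only works if the tensor is actually part of the autograd graph; if the preprocessing step operates on NumPy arrays or PIL images, the graph is broken before that point and no .grad will ever reach the original image.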

Yes, I tried retain_grad(), but I still get the same issue.

Could you post a minimal, executable code snippet that reproduces the issue, please?