Gradient of an unrelated variable is not coming out to be zero

I have a model inceptionfeaturesmodel that takes a (batch_size, 3, 299, 299) input image_variable and computes a (batch_size, 35, 35) output named prediction. When I do prediction[0][0][0].backward() and check image_variable.grad, the gradients of all inputs except the first image should ideally be zero (because they are not used in computing the output of the first image). But they are not. Any idea why?
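For reference, this is roughly the check I am running. It's a minimal sketch using a toy convolutional model as a stand-in for inceptionfeaturesmodel (whose code I haven't posted); the names and shapes mirror the ones above.

```python
import torch
import torch.nn as nn

# Toy stand-in for inceptionfeaturesmodel: a single conv whose channel-wise
# mean gives a (batch_size, 35, 35) map from a (batch_size, 3, 299, 299) input.
class ToyFeatures(nn.Module):
    def __init__(self):
        super().__init__()
        # kernel/stride chosen only so that 299x299 maps to 35x35
        self.conv = nn.Conv2d(3, 8, kernel_size=27, stride=8)

    def forward(self, x):
        return self.conv(x).mean(dim=1)   # (batch_size, 35, 35)

model = ToyFeatures()
image_variable = torch.randn(4, 3, 299, 299, requires_grad=True)

prediction = model(image_variable)        # (4, 35, 35)
prediction[0][0][0].backward()            # scalar depending on image 0 only

# Per-image gradient norms: for a purely per-sample model like this stand-in,
# the entries for images 1..3 come out as exactly zero.
print(image_variable.grad.reshape(4, -1).norm(dim=1))
```

With the actual inception features model, the entries for the other images are not zero, which is what I don't understand.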

If they're non-zero, then it probably means that all of the inputs were used to compute prediction[0][0][0]. Could you share more details about the model?

It's basically inception_v3 features, so each image in the batch should only be used to compute its own features, i.e. image_variable[i] is an RGB image that is used to compute prediction[i].
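One way to test that per-sample claim is to compare the batched output with the outputs of the same images fed one at a time. A sketch below, again with a small convolutional stand-in rather than the real inceptionfeaturesmodel (which may behave differently; this only illustrates the test):

```python
import torch
import torch.nn as nn

# Stand-in feature extractor with no cross-sample operations:
# (N, 3, 299, 299) -> (N, 1, 35, 35)
model = nn.Conv2d(3, 1, kernel_size=27, stride=8)
images = torch.randn(4, 3, 299, 299)

with torch.no_grad():
    batched = model(images)
    one_by_one = torch.cat([model(img.unsqueeze(0)) for img in images])

# If prediction[i] truly depends only on image_variable[i],
# batched and one-at-a-time outputs agree.
print(torch.allclose(batched, one_by_one, atol=1e-6))
```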