Weight Gradients Not Same for Same Input

I am trying to calculate the weight gradients of a layer w.r.t. the cross-entropy (CE) loss for a given input. However, when I run inference multiple times with the same input, I get different weight gradients each time. I am using a ResNet-18 model. What could be the cause of this?

I cannot reproduce the issue when checking the weight gradient of a random layer:

import torch
from torchvision import models

model = models.resnet18()
x = torch.randn(1, 3, 224, 224)

# Single forward/backward pass
out = model(x)
out.mean().backward()
print(model.layer4[0].conv1.weight.grad.abs().sum())

# Clear the accumulated gradients, then run several forward passes;
# backward() only uses the graph of the last forward pass
model.zero_grad()
out = model(x)
out = model(x)
out = model(x)
out.mean().backward()
print(model.layer4[0].conv1.weight.grad.abs().sum())

and get the same grad stats.
Could you post a minimal, executable code snippet showing the issue?

I found the issue by looking at your code. I had missed zero_grad(), and after adding it the code works fine. Thanks a lot!
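
For anyone else hitting this: without zero_grad(), PyTorch accumulates gradients in .grad across backward() calls, so every backward pass adds to the previous result instead of replacing it. Here is a minimal sketch of that accumulation behavior, using a hypothetical single Linear layer instead of the ResNet-18 above:

import torch

# Hypothetical toy layer for illustration, not the original model
layer = torch.nn.Linear(4, 2)
x = torch.randn(1, 4)

# First backward populates .grad
layer(x).sum().backward()
g1 = layer.weight.grad.clone()

# Second backward without zeroing: the new gradient is added to .grad
layer(x).sum().backward()
print(torch.allclose(layer.weight.grad, 2 * g1))  # True: gradients accumulated

# After zeroing, a fresh backward reproduces the original gradient
layer.zero_grad()
layer(x).sum().backward()
print(torch.allclose(layer.weight.grad, g1))  # True: matches the first run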