Parameters have gradients but optimizer does not update them?

I have a situation where I do:

import torch.optim as optim

opt = optim.Adam(model.parameters(), lr=0.001)
out = model(input)  # out is a scalar here, so backward() takes no arguments
out.backward()
opt.step()

When I look at the model parameters, they do have non-zero gradients, but opt.step() does not update them. What should I look for in this kind of situation?

I should mention that I override the model.parameters() method to return a list of two specific parameters (because my model does some custom computation that is not in any of the standard layers).
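Roughly, the override looks like this (a simplified sketch, not my actual model; the class and parameter names are just placeholders):

import torch
import torch.nn as nn

class CustomModel(nn.Module):
    def __init__(self):
        super().__init__()
        # two tensors used directly in forward(), not part of any standard layer
        self.weight_a = nn.Parameter(torch.randn(10, 10))
        self.weight_b = nn.Parameter(torch.randn(10))

    def forward(self, x):
        # returns a scalar so backward() can be called without arguments
        return (x @ self.weight_a + self.weight_b).sum()

    def parameters(self, recurse=True):
        # override: return only the two custom parameters
        return [self.weight_a, self.weight_b]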

What do you mean by

"I override the model.parameters() method to return a list of two specific parameters"?

The optimizer will only update the parameters that are passed to its constructor, so if model.parameters() does not return all of the parameters, the ones that are missing will never be updated.
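You can check which tensors the optimizer is actually holding, and whether they are the very same objects your model uses in forward(), with something like this (just a diagnostic sketch):

model_params = list(model.parameters())
for group in opt.param_groups:
    for p in group['params']:
        # the optimizer only steps these tensors; they must be the same
        # objects (not copies) as the ones used in the forward pass
        print(p.shape, p.requires_grad, p.grad is not None,
              any(p is q for q in model_params))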

Sorry, I meant to say that when I defined the model class, I overrode the parameters() method to return my custom params. I'm not overriding them after doing backprop.
This was a strange situation, and now it appears to be working fine.
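For reference, a quick sanity check that opt.step() actually changes the parameters (a generic snippet, nothing specific to my model):

before = [p.detach().clone() for p in model.parameters()]
opt.step()
for b, p in zip(before, model.parameters()):
    # True means the parameter value changed during the step
    print(not torch.equal(b, p.detach()))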