requires_grad=False has no effect after repeated freezing and unfreezing

I’m toggling requires_grad between True and False to repeatedly freeze and unfreeze a parameter v during training. When v.requires_grad changes from True to False, the parameter stops updating, as expected. But once it changes from False back to True, v keeps getting updated from then on, even after its requires_grad attribute has been set back to False several times.

What is the problem?
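Here is a minimal sketch of what I mean (the single parameter v and the quadratic loss are just placeholders for my real model; set_to_none=False is passed explicitly because newer PyTorch versions delete gradients by default):

```python
import torch

v = torch.nn.Parameter(torch.ones(2))
opt = torch.optim.Adam([v], lr=0.1)

def train_step():
    opt.zero_grad(set_to_none=False)  # zeroes .grad in place, keeps the tensor
    loss = (v ** 2).sum()
    if v.requires_grad:
        loss.backward()
    opt.step()
    print(v.detach())

v.requires_grad_(False)
train_step()  # frozen before any backward: v.grad is None, v does not move

v.requires_grad_(True)
train_step()  # unfrozen: v updates as expected

v.requires_grad_(False)
train_step()  # frozen again, but v STILL moves
```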

Parameters can still be updated even if they are frozen when an optimizer with internal running states is used. E.g. Adam lazily initializes running estimates for each updated parameter and will use these to update the parameter even if its gradients are set to zero.
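You can see this state in a toy single-parameter setup (the names below are illustrative, not from your code):

```python
import torch

v = torch.nn.Parameter(torch.ones(2))
opt = torch.optim.Adam([v], lr=0.1)

(v ** 2).sum().backward()
opt.step()  # the first step() lazily creates Adam's running estimates for v

# This state survives freezing; with a zero (but not None) gradient,
# the leftover momentum keeps nudging v on every subsequent step().
print(list(opt.state[v].keys()))  # ['step', 'exp_avg', 'exp_avg_sq']
print(opt.state[v]["exp_avg"])    # nonzero first-moment estimate
```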
You could delete their .grad attribute, which would let the optimizer skip these parameters in its step() method, or you could use optimizer.zero_grad(set_to_none=True), which also deletes the .grad attributes.
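With the same toy setup, setting the gradients to None makes step() leave the frozen parameter untouched:

```python
import torch

v = torch.nn.Parameter(torch.ones(2))
opt = torch.optim.Adam([v], lr=0.1)

(v ** 2).sum().backward()
opt.step()  # v now has Adam state and a populated .grad

v.requires_grad_(False)
opt.zero_grad(set_to_none=True)  # deletes the gradient: v.grad is None

before = v.detach().clone()
opt.step()  # parameters whose .grad is None are skipped
print(torch.equal(before, v))  # True: the frozen parameter did not move
```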


Thank you!
I can confirm that
optimizer.zero_grad(set_to_none=True)
works.

Why does it work? And will it affect the accuracy of training?