What is the behavior of passing model parameters with `requires_grad == False` to an optimizer?

I would recommend filtering them out, as this sticks to the Python Zen: "Explicit is better than implicit." A sketch of what this could look like follows below.
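A minimal sketch, assuming you want to freeze one submodule of a small example model (the model and hyperparameters are just placeholders):

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(10, 10), nn.Linear(10, 2))

# Freeze the first layer
for param in model[0].parameters():
    param.requires_grad = False

# Pass only the parameters that still require gradients to the optimizer
optimizer = torch.optim.Adam(
    filter(lambda p: p.requires_grad, model.parameters()), lr=1e-3
)
```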

An unwanted side effect of passing all parameters could be that the now-frozen parameters are still updated: if they required gradients before and you are using an optimizer with running estimates (e.g. Adam), the stale `.grad` values and the internal running averages can still change these parameters in later steps, even though no new gradients are computed for them.
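A small sketch of this effect (names and values are just illustrative): all parameters are passed to Adam, one training step builds up the running estimates, and a later `optimizer.step()` still moves the frozen weight because the stale gradient is still there.

```python
import torch
import torch.nn as nn

layer = nn.Linear(2, 2)
optimizer = torch.optim.Adam(layer.parameters(), lr=0.1)

# One regular training step so Adam accumulates its running estimates
loss = layer(torch.randn(4, 2)).sum()
loss.backward()
optimizer.step()

# Freeze the layer afterwards (gradients were not zeroed)
for p in layer.parameters():
    p.requires_grad = False

before = layer.weight.clone()
# No new backward pass, but the stale .grad and the running estimates
# still produce an update for the frozen parameter
optimizer.step()
print(torch.equal(before, layer.weight))  # False: the frozen weight changed
```

Calling `optimizer.zero_grad(set_to_none=True)` after freezing would avoid this particular update, but filtering the parameters when creating the optimizer is the more explicit solution.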
