If I leave out some of my model's parameters in the optimizer definition, and the left-out parameters do not have param.requires_grad = False set, will these parameters be changed?
No, but they will still accumulate gradients and use computation and memory to do so, which may or may not be significant. You might also be in for surprises when you do add them to the optimizer later.
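A minimal sketch of this behaviour, using two scalar parameters where only `a` is handed to the optimizer (the names and values here are just for illustration):

```python
import torch

# Two-parameter "model": only `a` is given to the optimizer.
a = torch.nn.Parameter(torch.tensor(1.0))
b = torch.nn.Parameter(torch.tensor(1.0))  # left out of the optimizer

opt = torch.optim.SGD([a], lr=0.1)

loss = (a * b) ** 2
loss.backward()

# `b` still accumulates a gradient (costing compute and memory)...
print(b.grad)  # tensor(2.)

opt.step()

# ...but only `a` was updated; `b` is unchanged.
print(a.item())  # 0.8
print(b.item())  # 1.0
```

Note that `b.grad` keeps accumulating across iterations unless you zero it yourself, since `opt.zero_grad()` only touches the parameters registered with the optimizer.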
To permanently avoid gradients and updates, doing both would be cleanest, because you don't rely on context (à la "we calculate grads, but they are not in the optimizer" or "we pass them to the optimizer but don't calculate grads") - the Zen of Python says explicit is better than implicit. Also, if for some reason (distributed training?) "no grad" later becomes "grad zero", you don't have weight decay or other regularisation messing with your params.
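The "do both" pattern could look like this; the two-layer model is just a stand-in for whatever part of your network you want to freeze:

```python
import torch

model = torch.nn.Sequential(
    torch.nn.Linear(4, 4),  # layer we want to freeze
    torch.nn.Linear(4, 1),
)

# 1) Stop gradient computation for the frozen layer.
for p in model[0].parameters():
    p.requires_grad_(False)

# 2) Also leave the frozen parameters out of the optimizer,
#    so e.g. weight decay can never touch them.
opt = torch.optim.SGD(
    [p for p in model.parameters() if p.requires_grad], lr=0.1
)

loss = model(torch.randn(8, 4)).sum()
loss.backward()
opt.step()

# The frozen layer got no gradients and was not updated.
print(model[0].weight.grad)  # None
```

Filtering on `p.requires_grad` when building the parameter list keeps the two decisions in sync, so the optimizer definition stays correct even if you later change which layers are frozen.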