Hello community,
in order to freeze some parameters, I could set their requires_grad attribute to False.
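For context, this is what I mean by freezing (a minimal sketch; model stands for my model with the submodules shown below):

# Freeze one submodule: autograd will no longer compute
# gradients for these parameters.
for param in model.feature_extractor.parameters():
    param.requires_grad = False

However, my question is about the optimizer. For example: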
optimizer = torch.optim.Adam([
    {'params': model.feature_extractor.parameters()},
    {'params': model.covar_module.parameters()},
    {'params': model.mean_module.parameters()},
], lr=0.01)
If I leave some of my model's parameters out of the optimizer definition, and those left-out parameters are not set to requires_grad = False, will they still be changed during training?
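Concretely, the situation I'm asking about would look like this (again just a sketch; here mean_module is the part left out of the optimizer):

# mean_module is deliberately left out of the optimizer,
# but its parameters still have requires_grad = True.
optimizer = torch.optim.Adam([
    {'params': model.feature_extractor.parameters()},
    {'params': model.covar_module.parameters()},
], lr=0.01)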
Thanks a lot!