I have been reading through `torch.optim.optimizer` tonight. Here is the link:
https://pytorch.org/docs/master/_modules/torch/optim/optimizer.html
```python
class _RequiredParameter(object):
    """Singleton class representing a required parameter for an Optimizer."""
    def __repr__(self):
        return "<required parameter>"

required = _RequiredParameter()
```
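If I understand it right, `required` is just a sentinel object that is compared by identity, so an `is` check can distinguish "no usable default" from any real value, even falsy ones like `0` or `None`. A minimal sketch of that idea (the class is copied from the quoted source; the `lr_default` / `momentum_default` names are mine):

```python
class _RequiredParameter(object):
    """Singleton sentinel: its only job is to be one unique object."""
    def __repr__(self):
        return "<required parameter>"

required = _RequiredParameter()

# An identity check tells a "missing" default apart from any real value,
# including falsy ones that a plain truthiness test would mishandle.
lr_default = required        # lr has no usable default
momentum_default = 0.0       # momentum defaults to a real (falsy!) value

print(lr_default is required)        # True  -> caller must supply lr
print(momentum_default is required)  # False -> 0.0 is a legitimate default
print(repr(required))                # <required parameter>
```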
I don't understand how this is used in `add_param_group`:
```python
for name, default in self.defaults.items():
    if default is required and name not in param_group:
        raise ValueError("parameter group didn't specify a value of required optimization parameter " +
                         name)
    else:
        param_group.setdefault(name, default)
```
Can somebody kindly explain how `_RequiredParameter` works? If I don't write a default `lr` in `param_group`, then `name not in param_group` is True, so for no error to be raised, `default is required` would have to be False, and I don't see how that can happen. I am confused.
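To make my confusion concrete, here is a minimal self-contained reproduction of the quoted loop. The `defaults` dict is hypothetical (modeled on SGD-style hyperparameters: `lr` marked required, `momentum` with a real default), and the standalone `add_param_group` is my own simplification of the method:

```python
class _RequiredParameter(object):
    """Sentinel copied from the quoted source."""
    def __repr__(self):
        return "<required parameter>"

required = _RequiredParameter()

# Hypothetical defaults dict for illustration only:
# lr is marked required, momentum has a real default value.
defaults = {"lr": required, "momentum": 0.9}

def add_param_group(param_group):
    """Simplified mirror of the quoted loop from Optimizer.add_param_group."""
    for name, default in defaults.items():
        if default is required and name not in param_group:
            raise ValueError("parameter group didn't specify a value of "
                             "required optimization parameter " + name)
        else:
            # Missing optional entries are filled in from defaults;
            # values the caller supplied are left untouched.
            param_group.setdefault(name, default)
    return param_group

# Supplying lr works; momentum is filled in from defaults.
group = add_param_group({"params": [], "lr": 0.01})
print(group["lr"], group["momentum"])  # 0.01 0.9

# Omitting lr raises, because its default IS the required sentinel.
try:
    add_param_group({"params": []})
except ValueError as e:
    print(e)
```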