A question about _RequiredParameter()

I have been reading torch.optim.optimizer tonight. Here is the relevant code:

class _RequiredParameter(object):
    """Singleton class representing a required parameter for an Optimizer."""
    def __repr__(self):
        return "<required parameter>"

required = _RequiredParameter()

I don't understand how this works in add_param_group:

for name, default in self.defaults.items():
    if default is required and name not in param_group:
        raise ValueError("parameter group didn't specify a value of "
                         "required optimization parameter " + name)
    else:
        param_group.setdefault(name, default)

Could someone kindly explain how _RequiredParameter works? My reasoning: if I don't write a default lr in param_group, then "name not in param_group" is True, so for no error to be raised, "default is required" would have to be False. I am confused.
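To make the condition concrete, here is a minimal, self-contained sketch of that loop outside of PyTorch (check_group and its defaults dict are made up for illustration; they mimic what add_param_group does with SGD-like defaults where lr is required):

```python
class _RequiredParameter(object):
    """Singleton sentinel standing in for 'no usable default'."""
    def __repr__(self):
        return "<required parameter>"

required = _RequiredParameter()

def check_group(defaults, param_group):
    # Mirrors the loop in add_param_group: error out if a required
    # parameter is missing, otherwise fill in the defaults.
    for name, default in defaults.items():
        if default is required and name not in param_group:
            raise ValueError("parameter group didn't specify a value of "
                             "required optimization parameter " + name)
        else:
            param_group.setdefault(name, default)
    return param_group

# lr is supplied by the group, so "name not in param_group" is False
# and no error is raised; momentum falls back to its default of 0.
print(check_group({"lr": required, "momentum": 0}, {"lr": 0.3}))
# {'lr': 0.3, 'momentum': 0}

# lr missing from the group -> both halves of the condition are True
try:
    check_group({"lr": required}, {})
except ValueError as e:
    print(e)
```

So the error fires only when both halves are True at once: the default is the required sentinel *and* the group didn't supply a value.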

This class was introduced in this PR and, based on its description, it seems to improve the docs slightly. However, I don't know whether it's still used or is dead code by now.


Thanks for your response. It's still used: as you can see, it appears in add_param_group() as part of a condition. I think defaults is something like lr=0.3, so name='lr' and default=0.3.
What I find hard to understand is why comparing default against required (the _RequiredParameter() instance) works at all.
Maybe I should give it up and just remember the behavior without delving into it.

@ptrblck FYI this PR causes some issues with Hydra configs for torch.optim.SGD