Parameter group subset handling of optimizers

I would like to define parameter groups such that

  {'params': module.parameters(), 'lr': 0.1},
  {'params': module.submodule.parameters(), 'lr': 0.01}

I would expect this to effectively set the learning rate for all parameters except those of submodule to 0.1, and the learning rate for the parameters of submodule to 0.01.

Could someone confirm whether this is indeed what happens, or whether any weird behavior is expected because the learning rate of submodule is technically being set twice?


Found the answer myself by trying it out:

ValueError: some parameters appear in more than one parameter group
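A minimal reproduction of this error. The toy module here (with a child registered as `submodule`) is illustrative, not the original poster's model:

```python
import torch
import torch.nn as nn

# Toy model: a parent module containing a submodule
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.submodule = nn.Linear(4, 2)
        self.head = nn.Linear(2, 1)

module = Net()

try:
    # module.parameters() already yields submodule's parameters,
    # so the two groups overlap and construction fails
    torch.optim.SGD([
        {'params': module.parameters(), 'lr': 0.1},
        {'params': module.submodule.parameters(), 'lr': 0.01},
    ])
except ValueError as e:
    print(e)  # some parameters appear in more than one parameter group
```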

The question now is: would it be possible to allow a definition like the one above without having to explicitly filter out parameters that already appear in a previous group?

It depends on your use case. Based on the docs, you can still pass default arguments to the optimizer, which will be used for any parameter group that doesn't specify them explicitly. Since you are specifying two overlapping parameter groups, you would need to filter them so that each parameter appears in only one group.
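One way to do that filtering, sketched with an illustrative toy module whose child is registered under the name `submodule` (exclude the submodule's parameters by name prefix so the groups are disjoint):

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.submodule = nn.Linear(4, 2)
        self.head = nn.Linear(2, 1)

module = Net()

# Keep only parameters NOT belonging to submodule
other_params = [p for name, p in module.named_parameters()
                if not name.startswith('submodule.')]

# Now the two groups are disjoint, so construction succeeds
optimizer = torch.optim.SGD([
    {'params': other_params, 'lr': 0.1},
    {'params': module.submodule.parameters(), 'lr': 0.01},
])
```

Filtering by parameter identity (e.g. a set of `id(p)` for the submodule's parameters) works as well if you'd rather not rely on the attribute name.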

Thanks for the feedback. It would still be a nice feature to have, rather than raising a ValueError.

The default arguments are already supported. I don't think passing overlapping parameter sets to different parameter groups would be a nice feature; it would be a really hard-to-debug error, which is rightfully disallowed.
Use parameter groups as separate, disjoint groups with their own optimizer arguments, and let the default arguments cover the rest.
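A sketch of that pattern (toy module names again illustrative): pass the default `lr` to the optimizer constructor and only override it in the group that needs a different value. A group that omits `lr` falls back to the default.

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.submodule = nn.Linear(4, 2)
        self.head = nn.Linear(2, 1)

module = Net()

# Parameters outside submodule, filtered by name prefix
rest = [p for name, p in module.named_parameters()
        if not name.startswith('submodule.')]

optimizer = torch.optim.SGD([
    {'params': rest},  # no 'lr' key: this group uses the default below
    {'params': module.submodule.parameters(), 'lr': 0.01},
], lr=0.1)  # default argument, applied to groups that don't override it

print([g['lr'] for g in optimizer.param_groups])  # [0.1, 0.01]
```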