Can one change the learning rate of a layer after the optimizer is initialised?

I can’t find an example anywhere. Additionally, I can’t find out what exactly optimizer.param_groups returns, so I don’t know whether I can filter by name to change the learning rate.

optimizer.param_groups is a list of dictionaries, one per parameter group, each holding that group's hyperparameters (such as 'lr') along with its parameters under 'params'. If you only have one param group (which is the case if you just initialized the optimizer with model.parameters()), then optimizer.param_groups[0]['lr'] is the learning rate, and you can assign to it at any time.
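A minimal sketch of that, assuming a toy torch.nn.Linear model (the model and the 0.1/0.01 rates are just placeholders):

```python
import torch

model = torch.nn.Linear(10, 2)          # stand-in model, just for illustration
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

print(optimizer.param_groups)           # a list with a single dict, containing 'lr', 'params', etc.
optimizer.param_groups[0]['lr'] = 0.01  # takes effect on the next optimizer.step()
```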
Parameter groups are useful e.g. when you want different learning rates for different parameters; for example, fast.ai advocates this for fine-tuning.
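A sketch of that pattern, with backbone and head as hypothetical stand-ins for a pretrained feature extractor and a newly added classifier; note that the groups are identified by position in the list, not by parameter name:

```python
import torch
import torch.nn as nn

# Hypothetical two-part setup standing in for a pretrained backbone plus a new head.
backbone = nn.Linear(10, 10)
head = nn.Linear(10, 2)

optimizer = torch.optim.SGD(
    [
        {'params': backbone.parameters(), 'lr': 1e-4},  # small rate for pretrained weights
        {'params': head.parameters(), 'lr': 1e-2},      # larger rate for the fresh head
    ],
    momentum=0.9,  # shared default applied to both groups
)

# Each dict becomes one entry of optimizer.param_groups, in the order given,
# so individual rates can still be changed after initialization:
optimizer.param_groups[0]['lr'] = 5e-5
```

Since the groups only come into being at construction time, any splitting of the parameters has to happen when the optimizer is created.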

Best regards

Thomas


Okay, I see, so I must split things up when initializing. Thank you.