Hi, I want to adjust the learning rate of one part of my model (let's call it PartA) using lr_schedulerA, and of another part (PartB) using lr_schedulerB.
I didn't find a built-in way to do this; the only solution I found is to use two optimizers and put the parameters of each part in the corresponding one:
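Roughly what I mean (a minimal sketch; PartA/PartB, the layer sizes, and the scheduler choices are placeholders for my real setup):

```python
import torch
import torch.nn as nn
import torch.optim as optim

# Toy stand-ins for the two parts of the model
model = nn.Sequential()
model.add_module("partA", nn.Linear(10, 10))
model.add_module("partB", nn.Linear(10, 2))

# One optimizer per part, each with its own scheduler
optimizerA = optim.SGD(model.partA.parameters(), lr=0.1)
optimizerB = optim.SGD(model.partB.parameters(), lr=0.01)
schedulerA = optim.lr_scheduler.StepLR(optimizerA, step_size=10, gamma=0.1)
schedulerB = optim.lr_scheduler.ExponentialLR(optimizerB, gamma=0.95)

for epoch in range(2):
    out = model(torch.randn(4, 10))
    loss = out.sum()
    optimizerA.zero_grad()
    optimizerB.zero_grad()
    loss.backward()
    # Step both optimizers, then both schedulers
    optimizerA.step()
    optimizerB.step()
    schedulerA.step()
    schedulerB.step()
```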
Your approach looks reasonable: you are not duplicating the optimizer, but rather using two separate optimizers and schedulers for different parts of the model.
I think this use case looks clean and is quite easy to understand, so I would stick to it rather than hack around the optimizer's param_groups to pass them to different schedulers.
I have two parts in my model, let's say 'feature' and 'classifier'. In the optimizer I have defined two learning rates for these parts (one param group per part). I am not sure how to set up the scheduler for these two.
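My setup is roughly the following (a sketch; the 'feature'/'classifier' module names and sizes are made up). From what I can tell, a scheduler like StepLR updates every param group of the optimizer by the same factor, so the two learning rates keep their ratio, but I'm not sure this is the intended way:

```python
import torch.nn as nn
import torch.optim as optim

model = nn.Sequential()
model.add_module("feature", nn.Linear(10, 10))
model.add_module("classifier", nn.Linear(10, 2))

# One optimizer, two param groups with different learning rates
optimizer = optim.SGD([
    {"params": model.feature.parameters(), "lr": 0.01},
    {"params": model.classifier.parameters(), "lr": 0.1},
])

# StepLR multiplies the lr of *every* param group by gamma,
# so both parts decay together and keep their 1:10 ratio
scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.5)
scheduler.step()  # feature: 0.01 -> 0.005, classifier: 0.1 -> 0.05
```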
@NoobCoder Hi, I ran into this problem too. Could you share how you set up OneCycleLR for two parameter groups with different learning rates? Thanks a lot!
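From the docs it looks like OneCycleLR accepts a list for max_lr, one value per param group, in the order the groups were passed to the optimizer. Is something like this sketch correct (module names and numbers are made up)?

```python
import torch.nn as nn
import torch.optim as optim

model = nn.Sequential()
model.add_module("feature", nn.Linear(10, 10))
model.add_module("classifier", nn.Linear(10, 2))

# momentum is needed because OneCycleLR cycles momentum by default
optimizer = optim.SGD([
    {"params": model.feature.parameters(), "lr": 0.01},
    {"params": model.classifier.parameters(), "lr": 0.1},
], momentum=0.9)

# One max_lr per param group; note that OneCycleLR overwrites the
# initial lrs above with max_lr / div_factor when it is constructed
scheduler = optim.lr_scheduler.OneCycleLR(
    optimizer, max_lr=[0.01, 0.1], total_steps=100
)

for _ in range(10):
    scheduler.step()  # called once per batch, after optimizer.step()
```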