Using two schedulers?

Hi Guys,

I'm currently trying to train my net with the Adam optimizer.
For this, I want to start with a learning rate of 1e-6 and slowly increase it to 1e-4 over the first 10,000 steps.
After that, I want to decay the learning rate by a factor of 0.5 every 100k steps.

I tried using two MultiStepLR schedulers, but stepping one right after the other renders the first one useless. Also, the base learning rates of both schedulers are automatically set to the same value.

So to my question:
Is there an elegant way of covering this scenario with a single scheduler, or am I bound to step each one exclusively and fix the base learning rate of one of them to make it work?

Thanks a lot in advance!
Fabi

I think the easiest way would be to use LambdaLR and define your learning rate manipulation in a separate function.
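Something along these lines should work (just a minimal sketch using the numbers from your post; the model, `warmup_steps`, `decay_every`, and `start_factor` are placeholders):

```python
import torch

# Sketch only: model is a placeholder, the numbers come from the post above
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # base lr = peak lr

warmup_steps = 10_000
decay_every = 100_000
start_factor = 1e-6 / 1e-4  # start the warmup at 1e-6

def lr_lambda(step):
    # LambdaLR multiplies the base lr (1e-4) by the factor returned here
    if step < warmup_steps:
        # linear warmup from 1e-6 to 1e-4 over the first 10k steps
        return start_factor + (1.0 - start_factor) * step / warmup_steps
    # afterwards: halve the lr every 100k steps
    return 0.5 ** ((step - warmup_steps) // decay_every)

scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda)

for step in range(300_000):
    # loss.backward(); ...   # your usual training step
    optimizer.step()
    scheduler.step()
```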

By the way, you could also chain two schedulers manually, where an if statement switches to the second one at the step (or epoch) you want, as sketched below.
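Roughly like this (reusing the optimizer from the snippet above; the switch-over point and scheduler choices are just assumptions based on your schedule):

```python
# Manual chaining (sketch): step the warmup scheduler first, then switch over
warmup = torch.optim.lr_scheduler.LinearLR(
    optimizer, start_factor=0.01, end_factor=1.0, total_iters=10_000)
decay = torch.optim.lr_scheduler.StepLR(
    optimizer, step_size=100_000, gamma=0.5)

for step in range(300_000):
    optimizer.step()
    if step < 10_000:
        warmup.step()   # ramps the lr from 1e-6 up to 1e-4
    else:
        decay.step()    # halves the lr every 100k steps after the warmup
```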

SequentialLR is applicable here too; it handles the hand-off between the two schedulers for you.
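For example (sketch again, needs PyTorch 1.10+; same placeholder optimizer as above):

```python
# SequentialLR (sketch): linear warmup for 10k steps, then step decay
warmup = torch.optim.lr_scheduler.LinearLR(
    optimizer, start_factor=0.01, end_factor=1.0, total_iters=10_000)
decay = torch.optim.lr_scheduler.StepLR(
    optimizer, step_size=100_000, gamma=0.5)

scheduler = torch.optim.lr_scheduler.SequentialLR(
    optimizer, schedulers=[warmup, decay], milestones=[10_000])

for step in range(300_000):
    optimizer.step()
    scheduler.step()  # SequentialLR switches from warmup to decay at step 10k
```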