How are the Learning Rates calculated in the Scheduler with ExponentialLR?

Hi,

Can someone please explain how the learning rates are calculated in the scheduler with ExponentialLR?

Thank you.

Hi,

If you go to the implementation you'll see that on each call to lr_scheduler.step(), the current learning rates are decayed by gamma, so lr_i = lr_{i-1} * gamma, which in terms of the base learning rate gives lr_i = lr_0 * gamma^i.
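
Here is a minimal sketch illustrating that decay; the model, the base LR of 0.1, and gamma=0.9 are just illustrative choices, not anything required by the scheduler:

```python
import torch

# Dummy model and optimizer with an illustrative base LR of 0.1
model = torch.nn.Linear(2, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Each scheduler.step() multiplies the current LR by gamma
scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.9)

for epoch in range(5):
    # ... training loop would go here ...
    optimizer.step()
    scheduler.step()
    # After i calls: lr_i = lr_0 * gamma**i, e.g. [0.09], [0.081], ...
    print(epoch, scheduler.get_last_lr())
```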