How to retrieve learning rate from ReduceLROnPlateau scheduler

Hi there,
I was wondering if someone could shed some light on the following questions:

  1. Why is ReduceLROnPlateau the only scheduler without a get_lr() method?

  2. How can I retrieve the learning rate in this case? Previously, without a scheduler, I would use optimizer.param_groups[0]['lr'], but now, after adding the scheduler, printing optimizer.param_groups[0]['lr'] shows no change in the learning rate.

  3. How should I save the optimizer state when using a scheduler? Previous examples I’ve seen call optimizer.state_dict() to save the optimizer state; should that be replaced with scheduler.state_dict()?

Thanks!

  1. The scheduler might track multiple param_groups as described here.

  2. It should still work as shown in this example:

import torch.nn as nn
import torch.optim as optim

model = nn.Linear(10, 2)
optimizer = optim.Adam(model.parameters(), lr=1e-3)
scheduler = optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, patience=10, verbose=True)

for i in range(25):
    print('Epoch ', i)
    # A constant metric never improves, so the scheduler sees a
    # plateau and reduces the lr after `patience` epochs.
    scheduler.step(1.)
    print(optimizer.param_groups[0]['lr'])

  3. No, you should still save the optimizer’s state_dict (and also call optimizer.step(), as the scheduler is not a replacement for the optimizer). Additionally, you could also store the scheduler’s state_dict, as in the sketch below.
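
For completeness, here is a minimal save/restore sketch; the checkpoint filename and the dict keys are arbitrary choices for this example, not a fixed API:

import torch

# Save model, optimizer, and scheduler state together.
torch.save({
    'model': model.state_dict(),
    'optimizer': optimizer.state_dict(),
    'scheduler': scheduler.state_dict(),
}, 'checkpoint.pt')

# Restore everything later.
checkpoint = torch.load('checkpoint.pt')
model.load_state_dict(checkpoint['model'])
optimizer.load_state_dict(checkpoint['optimizer'])
scheduler.load_state_dict(checkpoint['scheduler'])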

Thanks for the great answer!

By the way, is there any way to obtain the learning rate from the ReduceLROnPlateau scheduler?
The other schedulers I’m using all have a get_last_lr() method, but since ReduceLROnPlateau does not inherit from _LRScheduler, this method is not defined. Is there an equivalent to get_last_lr() for this scheduler?

Thanks a lot!


You could use the internal scheduler._last_lr attribute or scheduler.state_dict(), or alternatively check the learning rate in the optimizer via optimizer.param_groups[0]['lr'].
Note that the first two approaches only work after the first scheduler.step() call, e.g.:
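
A quick sketch of the three options, continuing the example above (assuming a PyTorch version where ReduceLROnPlateau sets _last_lr inside step(); in older releases this private attribute may not exist at all):

scheduler.step(1.)  # _last_lr is only populated after the first step()

print(scheduler._last_lr)                  # internal attribute, e.g. [0.001]
print(scheduler.state_dict()['_last_lr'])  # same value via the state_dict
print(optimizer.param_groups[0]['lr'])     # read directly from the optimizer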


Thank you so much! Your response is very helpful as always.
