ReduceLROnPlateau scheduler not working

Is there a known issue with torch.optim.lr_scheduler.ReduceLROnPlateau in version 0.3.1b0+2b47480? As soon as I switch to using the scheduler, my loss stays almost constant.
Here is the code I am using:

    optimizer = torch.optim.Adam(model.parameters(), lr=0.00003)
    scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer, patience=300, verbose=True, min_lr=0.00000001)

And I use the following in my training loop:

    # optimizer.step()
    scheduler.step(loss.data[0])

Thanks

You still have to call optimizer.step(). The scheduler only adjusts the learning rate; it does not perform the weight updates.
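For completeness, here is a minimal sketch of the corrected loop. Names like model, criterion, inputs, targets, and loader are placeholders, and loss.data[0] matches the 0.3.x API used above:

    for inputs, targets in loader:
        optimizer.zero_grad()          # clear gradients from the previous step
        loss = criterion(model(inputs), targets)
        loss.backward()                # compute gradients
        optimizer.step()               # perform the weight update
        scheduler.step(loss.data[0])   # only lowers the LR once the loss plateaus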
