Restarting with Adam


I am training my network with an early stopping strategy. I start with a higher learning rate and, based on the validation loss, I need to restart training from an earlier snapshot.

I am able to save/load snapshots with the model and optimizer state_dicts; no problem there.

My question is: once I restart training, how do I set the learning rate of Adam again? Should I start with a fresh Adam optimizer instead of loading its state_dict, or should I load the state_dict and then adjust the learning rate with
`optimizer.param_groups[0]['lr'] = lr`?

For example:
I train my network with lr = 1e-6 for 5 epochs and save the model and optimizer state_dicts.
I now restart from epoch 6, but need lr = 1e-7 instead. What is the best approach for this?
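Roughly what I'm doing right now, as a sketch (with `nn.Linear` as a hypothetical stand-in for my actual network; the snapshot is kept in memory here instead of on disk):

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the real network.
torch.manual_seed(0)
model = nn.Linear(10, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-6)

# One dummy training step so Adam has running statistics to save.
loss = model(torch.randn(4, 10)).sum()
loss.backward()
optimizer.step()

# Snapshot of both state_dicts, taken after epoch 5.
snapshot = {
    "epoch": 5,
    "model_state_dict": model.state_dict(),
    "optimizer_state_dict": optimizer.state_dict(),
}
# In practice this would go to disk, e.g. torch.save(snapshot, path).
```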


Are you using a learning rate scheduler at the moment, or are you manipulating the lr manually?
I would load the state_dict and set the learning rate to the desired value.
A freshly initialized Adam optimizer loses its running gradient statistics and might create loss spikes when training continues, which could take some additional training to bring the loss back down to its previous value.
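Something along these lines, as a sketch (with `nn.Linear` as a hypothetical stand-in for your model and the snapshot held in memory): load the optimizer state_dict to keep Adam's running averages, then overwrite only the learning rate.

```python
import torch
import torch.nn as nn

# Build a snapshot the same way it would have been saved.
model = nn.Linear(10, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-6)
loss = model(torch.randn(4, 10)).sum()
loss.backward()
optimizer.step()  # populate Adam's exp_avg / exp_avg_sq buffers

snapshot = {
    "model_state_dict": model.state_dict(),
    "optimizer_state_dict": optimizer.state_dict(),
}

# --- restart: rebuild the model/optimizer and load the snapshot ---
model = nn.Linear(10, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-6)
model.load_state_dict(snapshot["model_state_dict"])
optimizer.load_state_dict(snapshot["optimizer_state_dict"])

# Keep the loaded running averages, but continue with the smaller lr.
for param_group in optimizer.param_groups:
    param_group["lr"] = 1e-7
```

Looping over `param_groups` (rather than indexing `[0]`) also covers the case where the optimizer was created with several parameter groups.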


I'm manipulating it manually, since I restart training when the validation loss stops decreasing. I was just confused about the right way to manually adjust the learning rate, especially when using Adam. I did use ReduceLROnPlateau, but it didn't produce the desired behavior. I didn't know that about Adam; thanks for clearing it up!