How to load a saved Adam optimizer into a new Adam optimizer with a different learning rate?

Hi, I have trained my model with the Adam optimizer (lr=0.0001) and successfully saved the optimizer's state_dict. Now I want to continue training with a different learning rate (say lr=0.00005). What is the proper way to load the saved optimizer state (trained with lr=0.0001) into a new Adam optimizer instance with a different learning rate?

You can set the learning rate of all parameter groups directly:

for group in optim.param_groups:
    group["lr"] = 0.001  # or whatever new value you need

Using a learning rate scheduler might also be helpful; a rough sketch is below.
I hope this helps.
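Something like the following minimal sketch (ExponentialLR is just one choice of scheduler, and the model and training loop here are placeholders):

import torch

model = torch.nn.Linear(10, 1)  # placeholder model
optimizer = torch.optim.Adam(model.parameters(), lr=0.0001)
# decay the learning rate by a factor of 0.9 after every epoch
scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.9)

for epoch in range(5):
    # ... run your usual training steps and optimizer.step() here ...
    scheduler.step()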


Maybe you misunderstood what I was asking.
Say that I have trained the model for 30 epochs with:

optimizer = torch.optim.Adam(model.parameters(), lr=0.0001)

and saved the optimizer’s state_dict:

torch.save(optimizer.state_dict(), PATH)

Now I want to continue training with a new optimizer that has a different learning rate:

optimizer = torch.optim.Adam(model.parameters(), lr=0.00005)

How should I load the state_dict of the previous optimizer to the new one?
Should I load it at all, or do I just need to create the new optimizer without loading anything?
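
For concreteness, here is a sketch of what I think the pattern might be (reusing PATH and model from above): load the old state to keep Adam's moment estimates, then re-apply the new learning rate, since load_state_dict also restores the old lr from the saved param_groups. Is that the right approach?

optimizer = torch.optim.Adam(model.parameters(), lr=0.00005)
optimizer.load_state_dict(torch.load(PATH))  # restores Adam's buffers, but also the saved lr=0.0001
for group in optimizer.param_groups:
    group["lr"] = 0.00005  # re-apply the new learning rate after loading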