Print last updated learning rate in Adam optimizer

If we use an Adam optimizer like this: `optimizer = optim.Adam(model.parameters(), lr=1e-3)`, I think the learning rate is updated by `betas=(0.9, 0.999)`, i.e., the lr decays via the two momentum terms. Please correct me if I am wrong.
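
For reference, here is a minimal sketch of my setup (`model` here is a small placeholder standing in for my actual network):

```python
import torch.nn as nn
import torch.optim as optim

# Placeholder network; my real model is larger
model = nn.Linear(10, 2)

# Adam with the default betas. Per the PyTorch docs, lr is the base
# step size, and betas are the coefficients used for the running
# averages of the gradient and its square.
optimizer = optim.Adam(model.parameters(), lr=1e-3, betas=(0.9, 0.999))
```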

Now, the question is: how can I print the last learning rate of the model after training? The model is saved with `torch.save(model.state_dict(), 'full_data_train_exp3.pth')`.
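
To make it concrete, here is what I am trying; I am not sure whether `param_groups` is the right place to look for the current lr (the filename is from my training script):

```python
import torch
import torch.nn as nn
import torch.optim as optim

model = nn.Linear(10, 2)  # placeholder network
optimizer = optim.Adam(model.parameters(), lr=1e-3)

# ... training loop would go here ...

# Save the trained weights, as in my script
torch.save(model.state_dict(), 'full_data_train_exp3.pth')

# Attempt: read the lr stored in the optimizer's param_groups
# (one dict per parameter group)
for param_group in optimizer.param_groups:
    print(param_group['lr'])
```

If Adam adapts the step size internally, does `param_groups[0]['lr']` even reflect that, or would I need to look at `optimizer.state_dict()` instead?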
