After training the model for a while, I saved a checkpoint for the model and optimizer. What should I do to change the learning rate of Adam before continuing training?
You could access the lr attribute of each param_group and change the learning rate:
for param_group in optimizer.param_groups:
    param_group['lr'] = new_lr
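
For context, here is a minimal sketch of the full resume-and-adjust flow. It assumes the checkpoint was saved as a dict with "model_state_dict" and "optimizer_state_dict" keys and a file name of "checkpoint.pth"; adapt those to your own setup. The key point is to override lr after calling optimizer.load_state_dict, because loading the optimizer state restores the old learning rate:

import torch
import torch.nn as nn

# Placeholder model and optimizer; substitute your own.
model = nn.Linear(10, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Restore model and optimizer state from the checkpoint
# (assumed keys and file name).
checkpoint = torch.load("checkpoint.pth")
model.load_state_dict(checkpoint["model_state_dict"])
optimizer.load_state_dict(checkpoint["optimizer_state_dict"])

# load_state_dict brings back the old lr, so set the new one afterwards,
# in every param_group.
new_lr = 1e-4
for param_group in optimizer.param_groups:
    param_group["lr"] = new_lr

# ... continue the training loop as usual ...

Note that this only changes the learning rate itself; Adam's per-parameter state (the running averages of the gradients and their squares) is kept from the checkpoint, which is usually what you want when resuming.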