Change optimizer from Adam to SGD

How do I change the optimizer from Adam to SGD? They seem to take different parameters. Adam converges quickly, but SGD can reach a better final result, so I want to train with Adam first and then fine-tune with SGD.

You can create a new optimizer after a few epochs of training.
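A minimal sketch of that idea. The model, the switch epoch, and both learning rates are placeholders for your own setup:

```python
import torch
import torch.nn as nn

# Toy model and data; stand-ins for the real training setup.
model = nn.Linear(10, 1)
criterion = nn.MSELoss()
x, y = torch.randn(32, 10), torch.randn(32, 1)

switch_epoch = 3  # hypothetical: epoch at which to swap Adam for SGD
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(5):
    if epoch == switch_epoch:
        # Re-create the optimizer over the same parameters. SGD's
        # hyperparameters (lr, momentum) are set independently of Adam's.
        optimizer = torch.optim.SGD(model.parameters(), lr=1e-2, momentum=0.9)
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
```

Note that Adam's internal state (moment estimates) is simply discarded at the switch; the fresh SGD optimizer starts from the current weights only.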

Do model.zero_grad() and optimizer.zero_grad() have the same result?

Yes. The new optimizer is also constructed from model.parameters(), so both calls clear the gradients of the same tensors.
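A quick sketch verifying that equivalence, assuming the optimizer was built from all of model.parameters() (note that recent PyTorch versions set gradients to None by default rather than zeroing them in place):

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1)

# Populate gradients with a dummy backward pass.
model(torch.randn(2, 4)).sum().backward()

# opt.zero_grad() and model.zero_grad() clear the same tensors here,
# because the optimizer was constructed from model.parameters().
opt.zero_grad()

all_cleared = all(
    p.grad is None or torch.all(p.grad == 0) for p in model.parameters()
)
```

If the optimizer were built from only a subset of the model's parameters, the two calls would diverge: optimizer.zero_grad() would clear only that subset.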
