What’s the easiest way to reset an optimizer’s statistics, such as Adam’s moving averages, while keeping the same weights?
To give an example, suppose I have a model that I have pretrained on a dataset using Adam. Now I want to reset Adam’s statistics and train the model on another dataset, while keeping the same parameters to be optimized. What’s the best way to reset the optimizer’s state, except for param_groups?
Unfortunately, I don’t have access to hyperparameters such as the learning rate. The only data I have are the model and the optimizer, which is why I would like to “reset” the optimizer itself. Currently, what I do is the following:
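A minimal sketch of that kind of direct reset, assuming a standard `torch.optim` optimizer; the model and optimizer below are placeholders, and the exact snippet from the original post isn’t reproduced here:

```python
import collections

import torch

# Placeholder setup (in practice the model and optimizer come from the pretraining run).
model = torch.nn.Linear(10, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# ... pretraining on the first dataset ...

# Drop all per-parameter state (Adam's step counters, exp_avg, exp_avg_sq, ...)
# while leaving param_groups, and therefore the learning rate, untouched.
optimizer.state = collections.defaultdict(dict)
```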
Thanks for the answer. I also considered that approach; however, I was confused by param_groups and defaults and decided to tweak the state directly. All in all, I would consider adding a reset method (something like the sketch below), to be able to reuse the same optimizer. Would that make sense?
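To make the proposal concrete, here is a hypothetical sketch of what such a method could look like; the name `reset_state` and the Adam subclass are my own assumptions, not an existing PyTorch API:

```python
import collections

import torch


class AdamWithReset(torch.optim.Adam):
    """Hypothetical sketch: the proposed reset exposed as an optimizer method."""

    def reset_state(self) -> None:
        # Clear moving averages, step counters, etc., but keep param_groups
        # (and therefore hyperparameters such as the learning rate) intact.
        self.state = collections.defaultdict(dict)


model = torch.nn.Linear(10, 2)
optimizer = AdamWithReset(model.parameters(), lr=1e-3)

# ... train on the first dataset ...

optimizer.reset_state()  # reuse the same optimizer object on the next dataset
```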
That sounds like a good idea.
Would it just reset the internal state for some optimizers, or do you have anything else in mind?
Would you like to create this feature request on GitHub and explain your use case a bit?
Also, would you be interested in implementing this feature in case the proposal gets some positive feedback?