Reset Optimizer for retraining

Hi all,
I am currently implementing a method that needs a model to be trained multiple times on different datasets while keeping the architecture, optimizer, etc. identical. I therefore need to reset the model weights, optimizer statistics, and so on multiple times.
I am aware that I can reset the model weights by calling module.reset_parameters() on every submodule that defines it (see the sketch below), but is there a similar method for optimizers?
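For reference, this is roughly how I do the weight reset right now (a minimal sketch; I skip modules such as activations that don't define reset_parameters()):

```python
import torch
import torch.nn as nn

def reset_all_weights(model: nn.Module) -> None:
    @torch.no_grad()
    def _reset(module: nn.Module) -> None:
        # Only re-initialize submodules that actually implement reset_parameters()
        if hasattr(module, "reset_parameters"):
            module.reset_parameters()

    # apply() visits the module itself and all submodules recursively
    model.apply(_reset)
```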

As of now, the only trick I have found is to recreate it, so I wonder if I can just reset its state directly instead. Here is how I recreate it; please tell me if it's wrong:

new_optimizer = old_optimizer.__class__(model.parameters(), **old_optimizer.defaults)
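For context, a minimal runnable version of what I mean (a plain SGD optimizer here is just an example, not my actual setup):

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 2)  # example model, stands in for my real architecture
old_optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)

# ... train for a while, then rebuild the optimizer with the same class
# and the same default hyperparameters, which discards momentum buffers etc.
new_optimizer = old_optimizer.__class__(model.parameters(), **old_optimizer.defaults)
```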

Thanks!

I think recreating the optimizer is the right approach.
Besides calling reset_parameters() on each module, you could also recreate both the model and the optimizer, if that fits your use case.
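Something along these lines (a rough sketch; make_model, train, datasets, and the SGD hyperparameters are placeholders for whatever you actually use):

```python
import torch
import torch.nn as nn

def make_model() -> nn.Module:
    # Placeholder factory -- replace with your actual architecture.
    return nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))

for dataset in datasets:                                       # assumed iterable of datasets
    model = make_model()                                        # fresh weights every run
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)     # fresh optimizer state every run
    train(model, optimizer, dataset)                            # your training loop
```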