Cloning the optimizer

Is there a way to clone the optimizer?

optimizer = torch.optim.SGD(model.parameters(), lr=lr, nesterov=True, momentum=0.9)

I would like to use the same optimizer for several runs, but when I reuse the same instance it already carries some history from the previous run.
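
For what it's worth, the "history" here is the per-parameter state the optimizer accumulates; for SGD with momentum that is a momentum buffer per parameter. A minimal sketch showing it (the Linear model and tensor shapes are placeholders, just for illustration):

import torch

model = torch.nn.Linear(4, 2)  # placeholder model for illustration
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, nesterov=True, momentum=0.9)

loss = model(torch.randn(8, 4)).sum()
loss.backward()
optimizer.step()

# After a step, SGD with momentum has filled optimizer.state with momentum buffers;
# this is the history that carries over if the same instance is reused.
print(len(optimizer.state))  # > 0 once a step has been taken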

The other option would be to clear the optimizer to defaults.
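
One way to do that (a sketch, assuming you want to keep the same optimizer object) is to snapshot its state_dict right after creating it and load that snapshot back before each new run:

import copy

initial_state = copy.deepcopy(optimizer.state_dict())  # snapshot taken right after construction

# ... train on one run ...

optimizer.load_state_dict(initial_state)  # back to empty history and the original hyperparameters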

Found it: the model and the optimizer need to be created together, since the optimizer holds references to the model's parameters.

model = M()
optimizer = torch.optim.SGD(model.parameters(), lr=lr, nesterov=True, momentum=0.9)

So if I don't create a new model, the optimizer would still be tied to the old model's parameters, which was the problem I had.
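
Putting it together, one pattern that works is to re-create both the model and the optimizer per run, so the optimizer's param_groups point at the fresh model's parameters and its state starts out empty (num_runs and train are hypothetical placeholders; M and lr are from the snippets above):

for run in range(num_runs):
    model = M()
    optimizer = torch.optim.SGD(model.parameters(), lr=lr, nesterov=True, momentum=0.9)
    train(model, optimizer)  # hypothetical training loop for this run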