Is there a way to clone the optimizer?
optimizer = torch.optim.SGD(model.parameters(), lr=lr, nesterov=True, momentum=0.9)
I would like to reuse the same optimizer for multiple runs, but if I reuse the instance itself, it already carries state (e.g. momentum buffers) from earlier steps.
The other option would be to reset the optimizer back to its freshly constructed defaults.
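One possible approach (a sketch, not necessarily the only way): snapshot the optimizer's `state_dict()` right after construction, then call `load_state_dict()` on that snapshot whenever you want a "fresh" optimizer. The model and training step below are placeholders just to populate some momentum history:

```python
import copy

import torch
import torch.nn as nn

model = nn.Linear(4, 2)  # placeholder model
lr = 0.1
optimizer = torch.optim.SGD(model.parameters(), lr=lr, nesterov=True, momentum=0.9)

# Snapshot the pristine state right after construction, before any step().
initial_state = copy.deepcopy(optimizer.state_dict())

# A dummy training step that populates the momentum buffers.
loss = model(torch.randn(8, 4)).sum()
loss.backward()
optimizer.step()

# Restore the snapshot: the momentum history is discarded.
optimizer.load_state_dict(initial_state)
```

Since the snapshot was taken before any `step()`, its `state` entry is empty, so loading it clears the per-parameter buffers. If you only need the same configuration (not the same object), simply constructing a new `torch.optim.SGD(...)` with the same arguments also gives you a history-free optimizer.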