Reset optimizer parameters

I have a model and optimizer that are already trained. I then used a learning rate finder and a method from the LRFinder API to apply the optimal LR to my optimizer: optimizer.param_groups[0]['lr'] = optimal_lr. Now I want to create a new_model instance with fresh parameters and reuse the optimizer with the applied learning rate. To do that, I reset the parameters of the optimizer like this: optimizer.param_groups[0]['params'] = new_model.parameters(). However, this doesn't work: it gives poor results and the new_model doesn't converge.

model = ...
optimizer = optim.SGD(model.parameters(), lr=3e-4, momentum=0.9)

# train them, then decide to find the optimal lr
new_model = ...
optimizer.param_groups[0]['params'] = new_model.parameters()
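To illustrate what goes wrong (a minimal sketch with a toy nn.Linear model): assigning parameters() stores a generator in the param group rather than a list of tensors, and a generator is exhausted after a single pass, so later iterations over the group see no parameters at all.

```python
import torch.nn as nn
import torch.optim as optim

model = nn.Linear(4, 2)
optimizer = optim.SGD(model.parameters(), lr=3e-4, momentum=0.9)

new_model = nn.Linear(4, 2)
# this stores a generator, not a list of parameter tensors
optimizer.param_groups[0]['params'] = new_model.parameters()

params = optimizer.param_groups[0]['params']
first_pass = list(params)   # weight and bias: 2 tensors
second_pass = list(params)  # generator already exhausted: empty
print(len(first_pass), len(second_pass))  # 2 0
```

The optimizer's momentum state is also still keyed to the old model's parameters, which is another reason training degrades.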

And the problem is not the LR: when I manually create a new optimizer with this new optimal LR, I get better results.

This approach sounds right, as I don't think that assigning the parameters() generator would directly work. The optimizer assigns the parameters here, so you could either try to reproduce this method or just create a new optimizer and set the found learning rate afterwards.
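A minimal sketch of the second suggestion (a toy nn.Linear model stands in for your trained model, and optimal_lr is a hypothetical value from the LR finder):

```python
import torch.nn as nn
import torch.optim as optim

# toy stand-ins for the trained model and the LR-finder result
model = nn.Linear(4, 2)
optimizer = optim.SGD(model.parameters(), lr=3e-4, momentum=0.9)
optimal_lr = 1e-2  # hypothetical value returned by the LR finder

# create the new model and a fresh optimizer over its parameters,
# then apply the found learning rate to every param group
new_model = nn.Linear(4, 2)
new_optimizer = optim.SGD(new_model.parameters(), lr=3e-4, momentum=0.9)
for group in new_optimizer.param_groups:
    group['lr'] = optimal_lr
```

The fresh optimizer materializes the parameters into a proper list internally and starts with clean momentum state for the new model.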