New model means new optimizer?

Hello everyone,

Let's say we have a ReLU-NN class and, for the training phase, a training_backend class which handles all the optimization data and so on. Inside the training_backend's __init__ method there are the following assignments:

self.model = model  # a NN model of class ReLU-NN
self.optimizer = torch.optim.LBFGS(self.model.parameters(), ...)

def closure():
    ...
    nn_output = self.model(self.mu_tr)
    ...
    return loss

self.closure = closure

The training_backend has also the following method to train the assigned model:

def optimize_grad_control(self, maxIt, gradIt=5):
    ...
    self.optimizer.step(self.closure)
    ...
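For context, here is a minimal, self-contained sketch of how LBFGS uses such a closure. The loss function and the data are placeholders I made up (stand-ins for self.mu_tr and the targets); one detail worth noting is that LBFGS may call the closure several times per step, so the closure has to re-zero the gradients and recompute the loss each time:

```python
import torch

model = torch.nn.Sequential(
    torch.nn.Linear(2, 8), torch.nn.ReLU(), torch.nn.Linear(8, 1)
)
optimizer = torch.optim.LBFGS(model.parameters(), lr=0.1)

# placeholder training data (stand-ins for self.mu_tr and targets)
x = torch.randn(16, 2)
y = torch.randn(16, 1)

def closure():
    # LBFGS may evaluate the closure multiple times per step,
    # so zero the grads and recompute the loss on every call
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(model(x), y)
    loss.backward()
    return loss

loss = optimizer.step(closure)  # step() returns the closure's loss
```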

Now imagine that I have trained the model and afterwards created a new model of the ReLU-NN class. This new model is then assigned to the training_backend with

def new_model(self, new_model):
    # assign the new model
    self.model = new_model

My question is: when I want to train the new model with the optimize_grad_control method, do I have to reassign a new optimizer and define a new closure function? (I have read somewhere that it is not possible to dynamically change the model.parameters() inside the optimizer.)

Yes, you would need to create a new optimizer for the new model, since the old optimizer stores references to the parameters it was initially passed.
Alternatively, you could add the new parameters via add_param_group, but I don't think you would gain anything from it.
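One way to do that would be to rebuild the optimizer inside new_model itself. This is just a sketch (the TrainingBackend class here is a stripped-down stand-in for your training_backend, not your actual code):

```python
import torch

class TrainingBackend:
    def __init__(self, model):
        self.model = model
        self.optimizer = torch.optim.LBFGS(self.model.parameters())

    def new_model(self, new_model):
        # assign the new model and rebuild the optimizer so its
        # param_groups reference the new parameters, not the old ones
        self.model = new_model
        self.optimizer = torch.optim.LBFGS(self.model.parameters())

backend = TrainingBackend(torch.nn.Linear(3, 1))
fresh = torch.nn.Linear(3, 1)
backend.new_model(fresh)
# the rebuilt optimizer now holds the fresh model's parameters
assert backend.optimizer.param_groups[0]["params"][0] is fresh.weight
```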

Okay, so it is a PyTorch-internal thing? Because I think I do not need to redefine the closure function: the closure uses self.model, which is a reference to the model, and whenever I rebind the name self.model to a new object, the closure evaluates the newly assigned model.

What do you mean by that?

I don’t know how you’ve defined the closure method, but if it just uses some objects, there would be no need to redefine it.
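That is exactly the behavior in question, and it can be sketched in a few lines (Backend here is a toy stand-in with made-up shapes, not the real training_backend): the closure looks up self.model at call time, so it automatically uses whichever model is currently assigned.

```python
import torch

class Backend:
    def __init__(self, model):
        self.model = model
        # the closure resolves self.model at call time, so it
        # automatically evaluates whatever model is assigned then
        def closure():
            return self.model(torch.ones(1, 2)).sum()
        self.closure = closure

b = Backend(torch.nn.Linear(2, 1))
b.model = torch.nn.Linear(2, 1)  # swap in a new model
out = b.closure()                # evaluates the *new* model
```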

Okay, so it is a PyTorch internal thing?

Forget about it. I thought it mattered that I pass the parameters as self.model.parameters(), but the optimizer stores its own references to them in its __init__, so when I swap self.model via the new_model() method, the optimizer object does not notice the change.
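Right, and that asymmetry with the closure case can be shown directly: rebinding a Python name is invisible to the optimizer, because its param_groups hold references to the parameter tensors captured at construction time (the model names below are illustrative):

```python
import torch

old_model = torch.nn.Linear(2, 1)
model = old_model
# the optimizer captures references to the parameter tensors *now*
opt = torch.optim.LBFGS(model.parameters())

model = torch.nn.Linear(2, 1)  # rebinding the name 'model' ...

# ... is invisible to the optimizer: its param_groups still hold
# references to the old model's tensors, not the new model's
assert opt.param_groups[0]["params"][0] is old_model.weight
assert opt.param_groups[0]["params"][0] is not model.weight
```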