I am somewhat new to this, so please bear with me. Recently I was training a bunch of models in PyTorch when I ran into this problem. Suppose I have two models, m1 and m2.
import torch

m1 = modelClass1()
optimizer = torch.optim.Adam(m1.parameters())  # optimizer is only ever given m1's parameters
m1.train()
# ... training loop for m1: forward pass, loss.backward(), then ...
optimizer.step()

m2 = modelClass2()
m2.train()
# ... training loop for m2, but still using the optimizer built for m1 ...
optimizer.step()  # I forgot to create a new optimizer here
I forgot to reinitialize my optimizer for the second model: the loss and accuracy changed during training for my first model but stayed flat for my second one (which makes sense in hindsight, since I never gave the optimizer the second model's parameters).
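Here is a minimal, self-contained version of what I observed (a sketch: nn.Linear stands in for my actual model classes, and the data is random):

import torch
import torch.nn as nn

m1 = nn.Linear(4, 1)  # stand-in for modelClass1
m2 = nn.Linear(4, 1)  # stand-in for modelClass2
optimizer = torch.optim.Adam(m1.parameters())  # only m1's parameters

x = torch.randn(8, 4)
y = torch.randn(8, 1)

w_before = m2.weight.detach().clone()
loss = nn.functional.mse_loss(m2(x), y)
optimizer.zero_grad()
loss.backward()   # gradients land on m2's parameters...
optimizer.step()  # ...but step() only touches the params it holds (m1's), so no error
print(torch.equal(w_before, m2.weight))  # True: m2's weights never moved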
Considering that modelClass1 and modelClass2 are different classes with different parameters, why was no error thrown when I called optimizer.step() with an optimizer that was never given the second model's parameters? What exactly does the optimizer do with the parameters passed to it? I'd appreciate any insight into this.
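To make the question concrete: from the docs it looks like the optimizer keeps the parameter tensors it was constructed with in optimizer.param_groups, so (continuing the sketch above) I would expect something like this:

held = {id(p) for p in optimizer.param_groups[0]["params"]}
print(all(id(p) in held for p in m1.parameters()))  # True: the very same tensor objects
print(any(id(p) in held for p in m2.parameters()))  # False: m2 is invisible to the optimizer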