[quick question] Is the optimizer updated in this case?

So I loaded a pretrained model and wanted to resume training. If I modify the learning rate of this optimizer, will that change take effect, or will the net continue to use whatever lr was saved previously? Just want to make sure. Thanks!

My code is as follows:

import torch
import torch.optim as optim

optimizer = optim.Adam(net.parameters(), lr=0.001)

def train():
    net.train(True)
    ...
    optimizer.zero_grad()  # clear gradients accumulated from the previous iteration
    loss = criterion(...)
    loss.backward()
    optimizer.step()

if __name__ == "__main__":
    net.load_state_dict(torch.load('/xxx/xxx/weights_.pt'))
    train()

The net's state_dict does not contain any learning rate information; the optimizer does. Since you only load the model weights and construct a fresh optimizer, the lr of 0.001 will be used in your case (according to your example).
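For reference, here is a minimal sketch of the difference between constructing a fresh optimizer and actually resuming a saved one. The model, the checkpoint filename, and the lr values below are placeholders for illustration, not taken from your code:

import torch
import torch.optim as optim

net = torch.nn.Linear(10, 2)  # placeholder model
optimizer = optim.Adam(net.parameters(), lr=0.001)

# The learning rate lives in the optimizer's param_groups, not in the net:
print(optimizer.param_groups[0]['lr'])  # 0.001

# To resume the optimizer as well (Adam's moment estimates, its lr, etc.),
# save and load its state_dict alongside the model's:
torch.save({'model': net.state_dict(),
            'optimizer': optimizer.state_dict()}, 'checkpoint.pt')

ckpt = torch.load('checkpoint.pt')
net.load_state_dict(ckpt['model'])
optimizer.load_state_dict(ckpt['optimizer'])  # restores the saved lr too

# To change the lr after loading, edit param_groups directly:
for group in optimizer.param_groups:
    group['lr'] = 0.0001

Note that optimizer.load_state_dict() overwrites the lr you passed to the constructor with the saved one, so set group['lr'] after loading if you want a different value on resume.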
