Hi, I have a related question regarding this problem. If I want to update the parameters of model1 while keeping model2 fixed, do I need to set requires_grad=False for every param in model2.parameters()? My guess is that creating the optimizer as torch.optim.SGD(model1.parameters()) already means model2's parameters will never be changed, but I'm not sure whether freezing them is still necessary. I'm quite confused.
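
To make the question concrete, here is a minimal sketch of the setup I have in mind (model1, model2, the shapes, and the loss are just placeholders for illustration):

```python
import torch
import torch.nn as nn

# Placeholder models, just to illustrate the question
model1 = nn.Linear(10, 10)
model2 = nn.Linear(10, 10)

# Optimizer is given only model1's parameters.
# My guess: optimizer.step() can then only ever modify model1.
optimizer = torch.optim.SGD(model1.parameters(), lr=0.1)

# Is this freezing step also needed, or is restricting the
# optimizer to model1.parameters() already enough?
for p in model2.parameters():
    p.requires_grad = False

x = torch.randn(4, 10)
out = model2(model1(x))   # model1's output feeds into the fixed model2
loss = out.sum()          # placeholder loss

optimizer.zero_grad()
loss.backward()
optimizer.step()          # intended to update model1 only
```

Is the requires_grad=False loop redundant here, or does it matter (e.g. to avoid computing and storing gradients for model2)?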