How to train only one of multiple models?

Hi, I have a question about training multiple models on a single GPU.
I want to train only the first model and just feed its output into the next models.
To be more specific, I have three models:

  1. Pretrained Resnet50 : model1
  2. Pretrained MLP : model2
  3. Pretrained MLP2 : model3

Here, I want to hold models 2) and 3) fixed and train only model 1).

First, I load the pretrained models and weights:

model1 = models.resnet50(pretrained=True)
criterion1 = nn.L1Loss()
optimizer1 = torch.optim.Adam(model1.parameters(), lr=lr)

model2 = Model2()
optimizer2 = torch.optim.Adam(model2.parameters(), lr=lr)
model2.load_state_dict(pretrained_model2['state_dict'])
optimizer2.load_state_dict(pretrained_model2['optimizer'])

model3 = Model3()
optimizer3 = torch.optim.Adam(model3.parameters(), lr=lr)
model3.load_state_dict(pretrained_model3['state_dict'])
optimizer3.load_state_dict(pretrained_model3['optimizer'])

and then for training, I do something like this:

model1.train()
model2.eval()
model3.eval()

outputs = model1(img)
outputs = model2(outputs)
outputs = model3(outputs)

loss = criterion1(outputs, labels)
optimizer1.zero_grad()
loss.backward()
optimizer1.step()

I create only one loss and one optimizer, for model1, since that is the only model I'm training. Is this the right way to train my model?
Thank you for your time.

The general workflow should be alright, although you don't need to create optimizers for the last two models if you are not training them.
Also, you might want to freeze their parameters.
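A minimal sketch of what freezing could look like, using two `nn.Linear` layers as hypothetical stand-ins for your MLPs (any `nn.Module` works the same way). Note that you should not wrap the forward passes of model2/model3 in `torch.no_grad()`: the loss is computed on model3's output, so gradients must still flow *through* model2 and model3 back to model1's parameters, even though model2 and model3 themselves are not updated.

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for model2 / model3.
model2 = nn.Linear(10, 10)
model3 = nn.Linear(10, 1)

# Freeze the parameters of the models you don't want to train.
# Their weights get no gradients and cannot be updated, but the
# backward pass still propagates through them to earlier models.
for p in model2.parameters():
    p.requires_grad_(False)
for p in model3.parameters():
    p.requires_grad_(False)
```

With the parameters frozen, keeping `optimizer2`/`optimizer3` around is unnecessary, and `model2.eval()`/`model3.eval()` remain useful only for layers like batch norm or dropout that behave differently at inference time.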