Insert a trained model in the middle of another pre-trained model

I have two models. Model 1 is a simple fully connected classifier on MNIST. Model 2 is a pre-trained model. Now I want to insert model 2 between the layers of model 1, as follows:

model 1_1 > model 2_all > model 1_2 > model 1_3 > …

How can I do that?

You could just use the pre-trained model like any other nn.Module in your new model:

import torch
import torch.nn as nn
import torch.nn.functional as F

class MyPreTrainedModel(nn.Module):
    def __init__(self):
        super(MyPreTrainedModel, self).__init__()
        self.fc = nn.Linear(1, 1)

    def forward(self, x):
        return self.fc(x)

class MyModel(nn.Module):
    def __init__(self, model):
        super(MyModel, self).__init__()
        self.fc1 = nn.Linear(1, 1)
        self.pre_trained = model
        self.fc2 = nn.Linear(1, 1)

    def forward(self, x):
        x = F.relu(self.fc1(x))
        x = F.relu(self.pre_trained(x))
        x = self.fc2(x)
        return x

pre_trained = MyPreTrainedModel()
model = MyModel(pre_trained)
x = torch.randn(1, 1)
output = model(x)
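Since the second model is already pre-trained, you may also want to freeze its parameters so they are not updated while you train the surrounding layers. A minimal sketch (the `pre_trained` module here is just a placeholder standing in for model 2):

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the trained "model 2"
pre_trained = nn.Linear(1, 1)

# Freeze all of its parameters so the optimizer won't update them
for param in pre_trained.parameters():
    param.requires_grad_(False)

# When building the optimizer for the combined model, pass only the
# parameters that still require gradients, e.g.:
# optimizer = torch.optim.SGD(
#     (p for p in model.parameters() if p.requires_grad), lr=1e-2)
```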

Thank you for your reply. The second model is already trained, and I don't want to put the first model into the second model and then train it. I have trained both models. I think I can do it as follows, but I'd like to do it in another, more PyTorchic way :smiley:

  • Train both models.
  • Save their weights.
  • Create a third model with the same parameters as the first and second models.
  • Load the weights of both trained models into the third model.
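The steps above can be sketched with toy modules like the ones in the answer (the class and attribute names here are placeholders, not your real models). Because the third model reuses the same submodule names (`fc1`, `fc2`) as model 1, its state dict can be loaded with `strict=False`, and model 2's state dict can be loaded directly into the nested submodule:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy stand-in for "model 1": owns fc1 and fc2
class Model1(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(1, 1)
        self.fc2 = nn.Linear(1, 1)

    def forward(self, x):
        return self.fc2(F.relu(self.fc1(x)))

# Toy stand-in for the pre-trained "model 2"
class Model2(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(1, 1)

    def forward(self, x):
        return self.fc(x)

# Third model: model1.fc1 -> model2 -> model1.fc2
class Model3(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(1, 1)
        self.pre_trained = Model2()
        self.fc2 = nn.Linear(1, 1)

    def forward(self, x):
        x = F.relu(self.fc1(x))
        x = F.relu(self.pre_trained(x))
        return self.fc2(x)

m1, m2 = Model1(), Model2()  # assume both are already trained

m3 = Model3()
# fc1/fc2 keys match; strict=False skips the missing pre_trained.* keys
m3.load_state_dict(m1.state_dict(), strict=False)
# load model 2's weights into the nested submodule
m3.pre_trained.load_state_dict(m2.state_dict())
```

In practice you would replace `m1.state_dict()` / `m2.state_dict()` with `torch.load(...)` of the checkpoints you saved in step 2.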