Combine two resnet18 models

I already tried and also looked through other posts on the forum, but I still don't get it.
I have two models and I want to combine them.
Can anyone give a quick example?

import torch.nn as nn
from torchvision import models


class ResNet18Server(nn.Module):

    def __init__(self, config):
        super(ResNet18Server, self).__init__()
        self.logits = config["logits"]
        self.cut_layer = config["cut_layer"]

        self.model = models.resnet18(pretrained=False)
        num_ftrs = self.model.fc.in_features

        self.model.fc = nn.Sequential(nn.Flatten(),
                                      nn.Linear(num_ftrs, self.logits))

        self.model = nn.ModuleList(self.model.children())
        self.model = nn.Sequential(*self.model)

    def forward(self, x):
        # Run only the layers up to (and including) the cut layer.
        for i, l in enumerate(self.model):
            if i <= self.cut_layer:
                x = l(x)
        return x

Could you explain what “combine” would mean in this context?

I want to merge them. They are trained on different datasets, and I want one model that has parameters/weights from both. I mean that the new model should cover both models.

AFAIK, combining models does not work that way.

One way to “combine” them is to ensemble these models, as explained below:

Build a two-branch architecture with these 2 resnet18 models and extract features from each branch (at any layer of your choice). Then combine these features (for example, by concatenating them) and pass the result through an MLP to predict the target variables. Finetune this setup on the target task, either by freezing the backbone models or by using a lower learning rate for them.