How to create multiple instances of a model with different configs inside a wrapper module

class wrapper(nn.Module):
    def __init__(self, input_size, combined_input_size, hidden_size, num_classes, number_of_vars):
        super().__init__()
        self.m1 = var(input_size=input_size[0], hidden_size=hidden_size, num_classes=num_classes, p=0.0)
        self.m2 = var(input_size=input_size[1], hidden_size=hidden_size, num_classes=num_classes, p=0.0)
        self.m3 = var(input_size=input_size[2], hidden_size=hidden_size, num_classes=num_classes, p=0.0)
        self.m4 = var(input_size=input_size[3], hidden_size=hidden_size, num_classes=num_classes, p=0.0)
        self.m5 = var(input_size=input_size[4], hidden_size=hidden_size, num_classes=num_classes, p=0.0)

    def forward(self, var_list):
        x = F.normalize(self.m1(var_list[0]), p=2, dim=1)
        y = F.normalize(self.m2(var_list[1]), p=2, dim=1)
        z = F.normalize(self.m3(var_list[2]), p=2, dim=1)
        h = F.normalize(self.m4(var_list[3]), p=2, dim=1)
        w = F.normalize(self.m5(var_list[4]), p=2, dim=1)
        return[x, y, z, h, w], dim=1)

How can I make all these calls inside a loop? Or is there a better way to do this?

@ptrblck, I would appreciate your suggestions. Thanks in advance 🙂


@ptrblck is the superhero of PyTorch…
@ptrblck, thank you for all the work you have done and the work you will do…
Your work keeps the world of DL going, without any hindrances…


Exactly, I am a big fan of @ptrblck.


Ha, thanks for the kind words @AbraMunna @Bibhabasu_Mohapatra

I assume you would like to create the self.mX modules in a loop and also use a loop in the forward?
If var returns a plain Python object, i.e. not a PyTorch nn.Module, you could use setattr/getattr for both.
However, based on the usage in forward I would guess the self.mX are modules, so an nn.ModuleList would be the right approach here. Something like this should work:

import torch
import torch.nn as nn
import torch.nn.functional as F

class wrapper(nn.Module):
    def __init__(self, input_size, hidden_size, number_of_vars):
        super().__init__()
        self.module_list = nn.ModuleList()
        # assuming you want to create linear layers
        for idx in range(number_of_vars):
            self.module_list.append(nn.Linear(input_size[idx], hidden_size))

    def forward(self, input_list):
        assert len(input_list) == len(self.module_list), "different lengths"
        out = [F.normalize(module(input)) for module, input in zip(self.module_list, input_list)]
        out =, dim=1)
        return out

input_size = [2, 3, 4, 5, 6]
model = wrapper(input_size=input_size, hidden_size=16, number_of_vars=5)

batch_size = 4
input_list = [torch.randn(batch_size, s) for s in input_size]
out = model(input_list)
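
For completeness, here is a minimal sketch of the setattr/getattr alternative mentioned above, assuming the created objects are plain Python callables rather than nn.Modules (PlainWrapper and the scaling transforms are purely illustrative):

class PlainWrapper:
    def __init__(self, scales):
        # create the attributes in a loop via setattr
        for idx, s in enumerate(scales):
            setattr(self, f"t{idx}", lambda x, s=s: x * s)

    def __call__(self, input_list):
        # retrieve the attributes in a loop via getattr
        return [getattr(self, f"t{idx}")(x) for idx, x in enumerate(input_list)]

obj = PlainWrapper(scales=[1.0, 2.0, 3.0])
print(obj([1, 2, 3]))  # [1.0, 4.0, 9.0]

Note that plain attributes like these would not be registered as submodules, which is why nn.ModuleList is the right container once actual nn.Modules are involved.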

Thank you very much, @ptrblck. It is really helpful! I appreciate your gracious help.

Actually, I am trying to create a different model (with linear layers) for each variable. That’s why I wrote one model and created k instances of it, where k is the number of variables.
It’s not one linear layer per variable!

My var model looks like the following:

class var(nn.Module):
    def __init__(self, input_size, hidden_size, num_classes, p=0.0, l=1):
        super().__init__()
        layers = []
        for i in range(l):
            if i == 0:
                layers.append(nn.Linear(input_size, hidden_size))
            layers.append(nn.Linear(hidden_size, hidden_size))
            layers.append(nn.Dropout(0.2)) = nn.Sequential(*layers)

    def forward(self, x):
        out =
        return out

You could wrap the linear and dropout layers into an nn.Sequential module and append it to the nn.ModuleList. However, for more complicated constructs I would probably just use the for loop for the sake of readability. It’ll be executed once during model initialization, so I would claim no visible performance impact, as it’s not in the hot path.
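
To make this concrete, here is a minimal sketch combining both suggestions; the two linear layers plus dropout mirror the var module above, and SequentialWrapper as well as the sizes are placeholders:

import torch
import torch.nn as nn
import torch.nn.functional as F

class SequentialWrapper(nn.Module):
    def __init__(self, input_size, hidden_size, number_of_vars, p=0.2):
        super().__init__()
        self.module_list = nn.ModuleList()
        for idx in range(number_of_vars):
            # one small sub-network per variable, wrapped in an nn.Sequential
            # and appended to the nn.ModuleList
            self.module_list.append(nn.Sequential(
                nn.Linear(input_size[idx], hidden_size),
                nn.Linear(hidden_size, hidden_size),
                nn.Dropout(p),
            ))

    def forward(self, input_list):
        out = [F.normalize(m(x), p=2, dim=1) for m, x in zip(self.module_list, input_list)]
        return, dim=1)

input_size = [2, 3, 4, 5, 6]
model = SequentialWrapper(input_size, hidden_size=16, number_of_vars=5)
out = model([torch.randn(4, s) for s in input_size])
print(out.shape)  # torch.Size([4, 80])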

Makes sense! Thank you.

Hi @ptrblck, what if, instead of instantiating the models inside the wrapper, I would like to load them from saved models? The idea is that each model will be trained separately and saved to disk with, m_path). Now I want to load the saved models and remove the classifier layer, because I want to concatenate the embeddings (hidden layer) for further use. A code snippet of the idea would look like the following:

class wrapper(nn.Module):
    def __init__(self, input_size, hidden_size, number_of_vars):
        super().__init__()
        self.module_list = nn.ModuleList()
        for idx in range(number_of_vars):
            model = var()  # or load the saved model here instead
            # remove last layer somehow, don't know how?
            self.module_list.append(model)

The forward would be as usual! Thanks in advance

You can “remove” the last linear layer by replacing it with an nn.Identity layer via:

model = var()
model.last_linear_layer = nn.Identity()

Alternatively, you could also write a custom model implementation and skip this layer entirely, but based on your usage, replacing it with nn.Identity would be the simplest approach.
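
To tie this back to the loading question, here is a minimal sketch, assuming each var model was saved in its entirety via, path) and that its final classifier layer is stored under an attribute named classifier; both the checkpoint paths and the attribute name are assumptions, so adjust them to your actual model definition:

import torch
import torch.nn as nn

class EmbeddingWrapper(nn.Module):
    def __init__(self, model_paths):
        super().__init__()
        self.module_list = nn.ModuleList()
        for path in model_paths:
            model = torch.load(path)          # loads the full model saved via
            model.classifier = nn.Identity()  # "remove" the final classifier layer (assumed attribute name)
            self.module_list.append(model)

    def forward(self, input_list):
        # concatenate the hidden-layer embeddings of all sub-models
        embeddings = [m(x) for m, x in zip(self.module_list, input_list)]
        return, dim=1)

model_paths = ["", ""]  # hypothetical checkpoint files
model = EmbeddingWrapper(model_paths)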

Thank you very much for your gracious help @ptrblck