Pytorch combine two models into a single class and access their parameters from the parent class


So I have a model in pytorch that looks like this:

class LAS(nn.Module):
    def __init__(self, listener, speller):
        super(LAS, self).__init__()
        self.listener = listener
        self.speller = speller

    def forward(self, input):
        input = self.listener(input)
        input = self.speller(input)
        return input

Where listener and speller are two instances of a class inheriting from nn.Module as well:

class Speller(nn.Module):  # Each of them has its own layers and neurons, omitted here for clarity
class Listener(nn.Module):

I know I can do LAS.speller.parameters() to access the speller's parameters, but I would like to combine both so I can pass them to my optimizer like this:

optimizer = torch.optim.Adam(LAS.parameters(),

Instead of doing it like this:

optimizer = torch.optim.Adam(
    [{"params": LAS.listener.parameters()}, {"params": LAS.speller.parameters()}],

When I do LAS.parameters() will it be equal to actually passing both listener & speller parameters to the optimizer separately?
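One quick way to check this yourself is with toy stand-ins for the two submodules (the layer sizes below are arbitrary, just for illustration):

```python
import torch.nn as nn

# Minimal stand-ins for the real Listener and Speller; sizes are made up.
class Listener(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(8, 4)

    def forward(self, x):
        return self.fc(x)

class Speller(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)

    def forward(self, x):
        return self.fc(x)

class LAS(nn.Module):
    def __init__(self, listener, speller):
        super().__init__()
        # Assigning an nn.Module to an attribute registers it as a submodule,
        # so its parameters show up in the parent's parameters().
        self.listener = listener
        self.speller = speller

    def forward(self, x):
        return self.speller(self.listener(x))

las = LAS(Listener(), Speller())
combined = list(las.parameters())
separate = list(las.listener.parameters()) + list(las.speller.parameters())

# Same tensor objects, just collected through the parent module:
assert len(combined) == len(separate)
assert all(a is b for a, b in zip(combined, separate))
```

The key point is that `parameters()` walks all registered submodules recursively, in the order they were assigned in `__init__`.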


In your case, in short, yes. But passing the parameters separately lets you assign a different initial learning rate per model, as described here.
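For example, the parameter-group form lets each submodule train at its own rate (the toy modules and the 1e-3 / 1e-4 values below are just placeholders):

```python
import torch
import torch.nn as nn

# Toy submodules standing in for the real listener and speller.
listener = nn.Linear(8, 4)
speller = nn.Linear(4, 2)

# Each dict is a parameter group; a per-group "lr" overrides the default.
optimizer = torch.optim.Adam(
    [
        {"params": listener.parameters(), "lr": 1e-3},
        {"params": speller.parameters(), "lr": 1e-4},
    ]
)

assert [g["lr"] for g in optimizer.param_groups] == [1e-3, 1e-4]
```

With the combined `LAS.parameters()` call, both submodules land in a single group and share one learning rate.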