How to group a bunch of layers together?

I have a somewhat complicated model, and I want one part of it to be grouped under one name, another part under another name, and so on. The reason I want this is so that I can train the parts individually; specifically, so that I can pass the parameters of one part alone to the optimizer. How can I do that?

To be concrete, here’s a snippet of my code:

        self.feat1 = Cconv1d(in_channels=30, out_channels=60, kernel_size=2, stride=1, padding=2)
        self.feat2 = Cconv1d(in_channels=60, out_channels=120, kernel_size=3, stride=2, padding=0)
        # self.feat3 = nn.MaxPool1d(kernel_size=4)
        self.featl1 = Clinear(6360, 500)
        self.featl2 = Clinear(500, 31)
        self.featl3 = nn.Softmax(dim=1)

I want all of those to be under model.part1.
PS: I know about nn.Sequential, but I don’t want it because I have a special forward between my layers. Any other solution? Or a hacky way to pass the parameters of those layers alone to the optimizer?

You could create a custom submodule inside your main module:

import torch.nn as nn

class MySubmodule(nn.Module):
    def __init__(self):
        super().__init__()
        # register every layer that belongs to this part here
        self.feat1 = nn.Conv2d(1, 1, 3, 1, 1)
        # ...

    def forward(self, x):
        # any special forward logic between the layers goes here
        x = self.feat1(x)
        return x


class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        # the whole group is now addressable as model.part1
        self.part1 = MySubmodule()

    def forward(self, x):
        x = self.part1(x)
        return x
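
With the layers grouped this way, passing just that part to an optimizer becomes straightforward, since model.part1.parameters() yields only the parameters registered inside the submodule. A minimal sketch, assuming the classes above:

import torch

model = MyModel()
# only part1's parameters are handed to the optimizer,
# so optimizer.step() will update that part alone
optimizer = torch.optim.SGD(model.part1.parameters(), lr=1e-3)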

Alternatively, you could just create lists with the necessary parameters and pass them to the corresponding optimizer:

import torch

# concatenate the parameters of each layer you want to train
# (extend the list with the other layers in the same way)
params = list(model.feat1.parameters()) + list(model...)
optimizer = torch.optim.SGD(params, lr=1e-3)
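
During a training step, gradients still flow through the whole graph, but optimizer.step() only updates the parameters you passed in. A minimal sketch; x is a placeholder input batch and the loss is a stand-in for a real criterion:

out = model(x)         # x: placeholder input batch
loss = out.sum()       # stand-in for a real loss
optimizer.zero_grad()
loss.backward()        # gradients are computed for the whole model,
optimizer.step()       # but only the tensors in `params` are updated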

I ❤️ modularity. Thanks.
