NN with configurable number of layers

I am trying to create a fully connected network where the number of layers is a parameter that is chosen at initialization. A simplified version of my code is:

import torch.nn as nn
import torch.nn.functional as F

class MyNet(nn.Module):

  def __init__(self, input_size, num_layers, layer_size, output_size):
    super(MyNet, self).__init__()

    # the layers are stored in a plain Python list
    self.layers = []
    self.layers.append(nn.Linear(input_size, layer_size))
    for i in range(1, num_layers-1):
      self.layers.append(nn.Linear(layer_size, layer_size))
    self.layers.append(nn.Linear(layer_size, output_size))

  def forward(self, x):  # simplified, the choice of activation is not the point here
    for layer in self.layers[:-1]:
      x = F.relu(layer(x))
    return self.layers[-1](x)

so basically I am storing the layers in a plain Python list. The forward method works properly. Then I define the optimizer with something like:

import torch.optim as optim

net = MyNet(8, 5, 2, 1)  # a network with 5 linear layers of hidden size 2
optimizer = optim.Adam(net.parameters(), lr=0.1, weight_decay=0.0)

but PyTorch gives an error saying that the optimizer got an empty parameter list. This means that, when the object net is created, the parameters of the linear layers stored in net.layers are not being registered as parameters of the module. Is there a way to overcome this limitation?
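By the way, printing the parameters confirms that nothing got registered:

print(list(net.parameters()))  # -> [], Linear layers inside a plain list are invisible to nn.Module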

Thank you in advance.


Make self.layers an nn.ModuleList rather than a regular list, and pass net.parameters() to the optimizer constructor, not net. I.e.

optimizer = optim.Adam(net.parameters(), lr=0.1, weight_decay=0.0)
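
Concretely, the fix on the module side is a one-line change in __init__ (a sketch; the rest of your class stays as posted):

self.layers = nn.ModuleList()  # instead of self.layers = []

nn.ModuleList supports append, indexing, and iteration just like a regular list, but it registers every module added to it, so their parameters show up in net.parameters().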

Yes, I was passing net.parameters() to the optimizer, not net. That was a typo when I was writing the post here in the forum.

Anyway, I was not aware of the existence of nn.ModuleList and it actually solves my problem! Great, thank you!


If you don’t want to use nn.ModuleList, I think this should work. The per-layer parameters do have to be flattened into a single iterable of tensors, though; a list of generators (e.g. [layer.parameters() for layer in net.layers]) makes the optimizer complain:

optimizer = optim.Adam([p for layer in net.layers for p in layer.parameters()],
                       lr=0.1, weight_decay=0.0)
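
Alternatively, here is an untested sketch using the optimizer's parameter-group API, passing one group per layer; this also makes it easy to give different layers different hyperparameters later:

optimizer = optim.Adam([{'params': layer.parameters()} for layer in net.layers],
                       lr=0.1, weight_decay=0.0)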