Using for-loops in net initialization

I’ve been trying to define a neural net using for-loops so that I can change the network’s structure more easily without typing a bunch of extra statements. Basically, in the `__init__()` method of my net I have something like `for i in range(n_layers): self.dense_layers.append(nn.Linear(10, 10))`, and in the `forward()` method something like `for layer in self.dense_layers: x = layer(x)`. This way, if I want to add more layers, I just increase `n_layers` without typing anything else.

The net actually runs if I feed it some arbitrary (random) input; however, when I print the network’s parameters, or print the network itself, almost all the layers seem to be missing. I see only a few layers’ parameters and a few layers’ names. Is initializing a network this way not allowed?
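For reference, a minimal sketch of the pattern described above (the class name and sizes here are hypothetical, not from the original post). Storing layers in a plain Python list means they are never registered as submodules, so `parameters()` and `print(net)` cannot see them:

```python
import torch
import torch.nn as nn

class BrokenNet(nn.Module):
    def __init__(self, n_layers=4):
        super().__init__()
        # Plain Python list: nn.Module does NOT register these layers,
        # so their parameters are invisible to the optimizer and to print().
        self.dense_layers = []
        for i in range(n_layers):
            self.dense_layers.append(nn.Linear(10, 10))

    def forward(self, x):
        # The forward pass still works, because the layers are ordinary
        # callables held by the list.
        for layer in self.dense_layers:
            x = layer(x)
        return x

net = BrokenNet()
out = net(torch.randn(2, 10))        # runs fine
print(len(list(net.parameters())))   # 0: no parameters were registered
```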


Basically I’d like to know whether there’s any way (or a best way) to create a custom neural net without typing out every layer, so I can write an API like `net = ConvNet(network_parameters)` and have it build the network I want.

Try using an `nn.ModuleList` for `self.dense_layers`; it will properly register all your parameters with the parent module.
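As a sketch (class name and layer sizes are illustrative), the same loop with `nn.ModuleList` instead of a plain list registers every layer, so the parameters show up:

```python
import torch
import torch.nn as nn

class FixedNet(nn.Module):
    def __init__(self, n_layers=4):
        super().__init__()
        # nn.ModuleList registers each Linear as a submodule, so its
        # parameters appear in net.parameters() and in print(net).
        self.dense_layers = nn.ModuleList(
            nn.Linear(10, 10) for _ in range(n_layers)
        )

    def forward(self, x):
        for layer in self.dense_layers:
            x = layer(x)
        return x

net = FixedNet()
print(len(list(net.parameters())))  # 8: a weight and a bias per layer
```

Note that `nn.ModuleList` is only a registered container; unlike `nn.Sequential`, it has no `forward()` of its own, so you still iterate over it explicitly as above.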


That appears to do the trick! Thanks!