I’ve been trying to define a neural net using for-loops so that I can change the network’s structure without typing a bunch of extra statements. Basically, in the __init__() method of my net I have something like for i in range(n_layers): self.dense_layers.append(nn.Linear(10, 10)), and in the forward() method something like for layer in self.dense_layers: x = layer(x). This way, if I want to add more layers, I just increase n_layers without typing anything else.
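Here’s a stripped-down sketch of the pattern I mean (the layer sizes and the extra output layer are made up for illustration; my real net is larger):

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self, n_layers):
        super().__init__()
        self.dense_layers = []                    # plain Python list
        for i in range(n_layers):
            self.dense_layers.append(nn.Linear(10, 10))
        self.out = nn.Linear(10, 1)               # assigned as an attribute directly

    def forward(self, x):
        for layer in self.dense_layers:           # loop over the list in forward
            x = layer(x)
        return self.out(x)

net = Net(5)
y = net(torch.randn(2, 10))                       # forward pass runs fine
print(net)                                        # but only `out` shows up here
print(sum(1 for _ in net.parameters()))           # far fewer parameters than I expect
```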
The net actually runs if I feed it some arbitrary (random) input; however, when I print the network’s parameters, or print the network itself, almost all of the layers seem to be missing: only a few layers’ parameters and names show up. Is initializing a network this way not allowed?