How does one make sure that a custom NN has parameters?

I was creating a custom model, and when I printed the object out I got an empty module:

(Pdb) mdl_sgd
NN (
)

and .parameters() is empty:

list(mdl_sgd.parameters()) =  []

My class looks as follows:

class NN(torch.nn.Module):
    def __init__(self, D_layers,act,w_inits,b_inits,bias=True):
        super(type(self), self).__init__()
        # activation func
        self.act = act
        #create linear layers
        self.linear_layers = [None]
        for d in range(1,len(D_layers)):
            linear_layer = torch.nn.Linear(D_layers[d-1], D_layers[d],bias=bias)
            self.linear_layers.append(linear_layer)

is there a reason that this does not work?

Try something like this:

import torch.nn as nn

class NN(nn.Module):
    def __init__(self):
        super(NN, self).__init__()
        linear_layers = []
        for i in range(10):
            linear_layers.append(nn.Linear(5, 5))
        self.net = nn.Sequential(*linear_layers)
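A quick sanity check (a sketch that re-states the class above so it runs standalone): assigning the nn.Sequential to an attribute registers every layer, so .parameters() is no longer empty.

```python
import torch
import torch.nn as nn

class NN(nn.Module):
    def __init__(self):
        super(NN, self).__init__()
        linear_layers = []
        for i in range(10):
            linear_layers.append(nn.Linear(5, 5))
        # assigning the Sequential to an attribute registers all 10 layers
        self.net = nn.Sequential(*linear_layers)

model = NN()
# 10 Linear layers, each with a weight and a bias -> 20 parameter tensors
print(len(list(model.parameters())))
```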

Also, another easier way to look at your weights would be to do this (in a REPL of some sort):

model = NN()
model.state_dict()
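For example (a sketch with a tiny stand-in model; the fc layer name is just illustrative), state_dict() maps each registered parameter name to its tensor:

```python
import torch
import torch.nn as nn

class NN(nn.Module):
    def __init__(self):
        super(NN, self).__init__()
        self.fc = nn.Linear(3, 2)  # stand-in layer

model = NN()
# state_dict maps names to tensors, e.g. 'fc.weight' and 'fc.bias'
for name, tensor in model.state_dict().items():
    print(name, tuple(tensor.shape))
```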

How about ModuleList?


I guess that you want to build some dynamic network. I think you should read some PyTorch code, e.g. the ResNet and DenseNet implementations.

My issue is that I can’t even loop through the parameters to apply an update procedure, because .parameters() is empty.
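For reference, once .parameters() is populated, the manual update loop is only a few lines (a minimal SGD sketch using a stand-in nn.Linear model; the learning rate is arbitrary):

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 1)  # stand-in for a properly registered custom NN
x, y = torch.randn(8, 4), torch.randn(8, 1)

loss = nn.functional.mse_loss(model(x), y)
loss.backward()

lr = 0.01
with torch.no_grad():
    # this loop only sees parameters that were registered with the module
    for p in model.parameters():
        p -= lr * p.grad   # plain SGD step
        p.grad.zero_()
```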

Replace the plain Python list with self.linear_layers = nn.ModuleList().
Using nn.ModuleList registers the contained modules’ parameters with your class.
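Applied to the original class, the fix looks roughly like this (a sketch; the w_inits/b_inits arguments from the question are omitted since their logic wasn’t shown):

```python
import torch
import torch.nn as nn

class NN(nn.Module):
    def __init__(self, D_layers, act, bias=True):
        super(NN, self).__init__()
        self.act = act
        # nn.ModuleList registers each appended layer's parameters
        self.linear_layers = nn.ModuleList()
        for d in range(1, len(D_layers)):
            self.linear_layers.append(nn.Linear(D_layers[d-1], D_layers[d], bias=bias))

model = NN([3, 4, 2], act=torch.relu)
# two Linear layers, each with weight and bias -> 4 parameter tensors
print(len(list(model.parameters())))
```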


What is the difference between nn.ModuleList() and torch.nn.ParameterList?
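Roughly: nn.ModuleList holds whole nn.Module objects (layers), while nn.ParameterList holds bare nn.Parameter tensors; both register their contents with the parent module. A small sketch contrasting the two:

```python
import torch
import torch.nn as nn

class Both(nn.Module):
    def __init__(self):
        super(Both, self).__init__()
        # ModuleList: a registered list of sub-modules
        self.layers = nn.ModuleList([nn.Linear(2, 2) for _ in range(3)])
        # ParameterList: a registered list of bare parameter tensors
        self.scales = nn.ParameterList([nn.Parameter(torch.ones(2)) for _ in range(3)])

m = Both()
# 3 Linears (weight + bias each) + 3 bare parameters = 9 tensors
print(len(list(m.parameters())))
```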

Now that I think about it, I don’t think I even know why we need to register things. I posted a separate question to address it: