I was creating a custom model, and when I printed the object out I got an empty model:
(Pdb) mdl_sgd
NN (
)
and .parameters()
is empty:
list(mdl_sgd.parameters()) = []
My class looks as follows:
class NN(torch.nn.Module):
    def __init__(self, D_layers, act, w_inits, b_inits, bias=True):
        super(type(self), self).__init__()
        # activation function
        self.act = act
        # create linear layers
        self.linear_layers = [None]
        for d in range(1, len(D_layers)):
            linear_layer = torch.nn.Linear(D_layers[d-1], D_layers[d], bias=bias)
            self.linear_layers.append(linear_layer)
Is there a reason this does not work?
sinhasam
(Samarth Sinha)
August 9, 2017, 1:22am
Try something like this:
class NN(nn.Module):
    def __init__(self):
        super(NN, self).__init__()
        linear_layers = []
        for i in range(10):
            linear_layers.append(nn.Linear(5, 5))
        self.net = nn.Sequential(*linear_layers)
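A quick standalone check that wrapping the layers in nn.Sequential registers their parameters (the sizes here just mirror the example above):

```python
import torch.nn as nn

# Ten Linear(5, 5) layers; nn.Sequential registers each one as a submodule
layers = [nn.Linear(5, 5) for _ in range(10)]
net = nn.Sequential(*layers)

# Each Linear contributes a weight and a bias, so 20 parameter tensors total
print(len(list(net.parameters())))  # 20
```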
sinhasam
(Samarth Sinha)
August 9, 2017, 1:24am
Also, another easier way to look at your weights would be to do this (in a REPL of some sort):
model = NN()
model.state_dict()
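For instance, with a small throwaway model (the layer sizes are made up), state_dict() maps each registered parameter's name to its tensor:

```python
import torch.nn as nn

# A tiny example model; nn.Sequential names its children "0", "1", ...
model = nn.Sequential(nn.Linear(5, 5), nn.Linear(5, 2))

# state_dict() maps parameter names to tensors
for name, tensor in model.state_dict().items():
    print(name, tuple(tensor.shape))
# 0.weight (5, 5)
# 0.bias (5,)
# 1.weight (2, 5)
# 1.bias (2,)
```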
I guess that you want to build a dynamic network. I think you should read some PyTorch code, like the ResNet or DenseNet implementations.
My issue is that I can't even loop through the parameters to update them with an update procedure! .parameters() is empty.
vabh
(Anuvabh)
August 9, 2017, 5:05pm
Replace self.linear_layers = [None] with self.linear_layers = nn.ModuleList(). Using nn.ModuleList registers the modules' parameters in your class.
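As a sketch of that fix applied to the original class (simplified: the w_inits/b_inits arguments are dropped, and the layer sizes below are made up for the demo):

```python
import torch
import torch.nn as nn

class NN(nn.Module):
    def __init__(self, D_layers, act, bias=True):
        super(NN, self).__init__()
        # activation function
        self.act = act
        # nn.ModuleList registers each appended layer's parameters
        self.linear_layers = nn.ModuleList()
        for d in range(1, len(D_layers)):
            self.linear_layers.append(
                nn.Linear(D_layers[d - 1], D_layers[d], bias=bias))

mdl = NN(D_layers=[3, 4, 2], act=torch.relu)
# Two Linear layers, each with a weight and a bias: 4 parameter tensors
print(len(list(mdl.parameters())))  # 4
```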
What is the difference between nn.ModuleList() and torch.nn.ParameterList?
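A small sketch of the distinction (shapes here are arbitrary): nn.ModuleList holds whole nn.Module instances, while nn.ParameterList holds raw nn.Parameter tensors that you use directly in forward():

```python
import torch
import torch.nn as nn

# nn.ModuleList holds Modules (e.g. layers), each with its own parameters
mods = nn.ModuleList([nn.Linear(3, 3), nn.Linear(3, 1)])

# nn.ParameterList holds bare nn.Parameter tensors you manage yourself
params = nn.ParameterList([nn.Parameter(torch.randn(3, 3)),
                           nn.Parameter(torch.randn(3))])

print(len(list(mods.parameters())))    # 4 (weight + bias per Linear)
print(len(list(params.parameters())))  # 2
```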
Now that I think about it, I don't think I even know why we need to register things. I made a question to address it:
I was trying to make a custom nn module and I was having issues registering variables. I have been kindly pointed to nn.ModuleList() and torch.nn.ParameterList. However, I think I don't understand in the first place why I need to "register" parameters. What's the point of all this?
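To make the effect of registration concrete, here is a minimal side-by-side sketch (class names are made up): registration is what lets .parameters(), state_dict(), and .to(device) find a module's tensors, so an optimizer given an unregistered module would see nothing to update:

```python
import torch.nn as nn

class Plain(nn.Module):
    def __init__(self):
        super(Plain, self).__init__()
        # plain Python list: the Linear is NOT registered as a submodule
        self.layers = [nn.Linear(4, 4)]

class Registered(nn.Module):
    def __init__(self):
        super(Registered, self).__init__()
        # nn.ModuleList: the Linear IS registered
        self.layers = nn.ModuleList([nn.Linear(4, 4)])

print(len(list(Plain().parameters())))       # 0 -- an optimizer would get nothing
print(len(list(Registered().parameters())))  # 2 -- weight and bias
```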