Adding parameters to nn.Module: what's the best way?

Hi guys!

So I just found out the hard way that if I do something like

self.conv1 = nn.Conv2d(3, 6, 3)

then the kernel weights are recognized as parameters (they show up in net.parameters()), so the optimizer does update them, while doing the following

self.convNets.append(nn.Conv2d(3, 6, 3))

does not. In other words, keeping your conv2d layers in a plain Python list inside your nn.Module does not register their weights, so the optimizer never changes them at all.
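
To make it concrete, here's a stripped-down version of what I mean (the class name is just for illustration):

import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(3, 6, 3)        # attribute assignment: registered
        self.convNets = [nn.Conv2d(3, 6, 3)]   # plain Python list: NOT registered

net = Net()
print(len(list(net.parameters())))  # prints 2 -- only conv1's weight and bias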

My question is: what's the best way to make sure those weights do get optimized? Looking at the source code I can see the register_parameter() method on nn.Module, but is that the best way?
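
From the docs it looks like register_parameter() takes a raw tensor wrapped in nn.Parameter, so I imagine something like this (the 'kernel' name and shape are just my guess), though I'm not sure it's the right tool for whole Conv2d layers:

import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        # registers a raw tensor as a learnable parameter
        self.register_parameter('kernel', nn.Parameter(torch.randn(6, 3, 3, 3)))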

To be honest, I was hoping that since I use those weights in my forward() pass, the framework would optimize them automatically…

Thanks in advance,
Yoni.

Use nn.ModuleList() instead of the usual Python list. (http://pytorch.org/docs/master/nn.html?highlight=modulelist#torch.nn.ModuleList)
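
A minimal sketch of how that would look for your example (the channel sizes here are made up so the layers chain correctly):

import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        # nn.ModuleList registers every contained module, so their
        # weights and biases show up in net.parameters()
        self.convNets = nn.ModuleList([nn.Conv2d(3, 6, 3), nn.Conv2d(6, 12, 3)])

    def forward(self, x):
        for conv in self.convNets:
            x = conv(x)
        return x

net = Net()
print(len(list(net.parameters())))  # prints 4 -- weight and bias for each conv

You can also append to an nn.ModuleList with convNets.append(...) just like a regular list.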

Nice, thanks!

Is there a tutorial where I can learn these things systematically?
I've gone through the basic 60-minute tutorial, the tutorial by Justin, the data loading one, etc., but didn't see that covered.

I think that's normal. I've been using PyTorch for over two months and only discovered nn.ModuleList() last week :)

I don't think there is a single tutorial that covers everything, because different problems call for different functionality. Anyway, feel free to use this forum. One of the many things I like about PyTorch is that the community and the developers are very helpful.
