State_dict does not contain keys for conv layers in list

I have reproduced my issue below.
I have defined my net as a class in `sample.py`:

import torch.nn as nn
class Classifier_Module(nn.Module):

    def __init__(self,dilation_series,padding_series):
        super(Classifier_Module, self).__init__()
        self.conv2d_list = []
        for dilation, padding in zip(dilation_series, padding_series):
            self.conv2d_list.append(nn.Conv2d(2048, 5, kernel_size=3, stride=1,
                                              padding=padding, dilation=dilation, bias=True))

    def forward(self, x):
        out = self.conv2d_list[0](x)
        for i in range(len(self.conv2d_list) - 1):
            out = self.conv2d_list[i + 1](x) + out
        return out

class Module1(nn.Module):
    def __init__(self):
        super(Module1, self).__init__()
        self.layer = self._make_pred_layer(Classifier_Module, [6,12,18,24],[6,12,18,24])

    def _make_pred_layer(self,block, dilation_series, padding_series):
        return nn.Sequential(block(dilation_series, padding_series))

    def forward(self, x):
        x = self.layer(x)
        return x

The state dictionary of the net does not contain any keys corresponding to the conv2d layers of the Classifier_Module.

import sample
model = getattr(sample, 'Module1')()
print(model)   # 1: does not show the conv2d list
for key in model.state_dict().keys():
    print(key)    # 2: does not show the conv2d list
print(model.layer._modules['0'].conv2d_list)  # this shows the conv2d list

How can I fix this issue? Is there any other way to perform a similar function without writing code for each conv2d layer?

You need to use nn.ModuleList rather than a plain Python list of modules. This is a very common trap for new users to fall into, and I’m curious if there’s a particular place in the docs/tutorials/etc. where we should add an additional note about it.
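A minimal sketch of the fix, keeping the same layer shapes as in the question: wrapping the layers in `nn.ModuleList` registers each conv as a submodule, so its parameters show up in `state_dict()` and get picked up by `parameters()` and `.cuda()`.

```python
import torch.nn as nn

class Classifier_Module(nn.Module):

    def __init__(self, dilation_series, padding_series):
        super(Classifier_Module, self).__init__()
        # nn.ModuleList (instead of a plain Python list) registers each
        # conv layer as a submodule of this module
        self.conv2d_list = nn.ModuleList()
        for dilation, padding in zip(dilation_series, padding_series):
            self.conv2d_list.append(nn.Conv2d(2048, 5, kernel_size=3, stride=1,
                                              padding=padding, dilation=dilation, bias=True))

    def forward(self, x):
        out = self.conv2d_list[0](x)
        for conv in self.conv2d_list[1:]:
            out = out + conv(x)
        return out

module = Classifier_Module([6, 12, 18, 24], [6, 12, 18, 24])
# state_dict now contains keys like 'conv2d_list.0.weight', 'conv2d_list.0.bias', ...
print(list(module.state_dict().keys()))
```

`nn.ModuleList` supports indexing, slicing, and iteration like a regular list, so the `forward` logic doesn't need to change.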


Thank you for your quick reply.
I think nn.ModuleList could be mentioned in the 60-minute Blitz tutorial.
I also wanted to know one more thing: is it okay to use Python lists if I want my net to return more than one output? I will calculate a loss for each of these outputs and add them up. Specifically, would autograd work properly when I use Python lists to return multiple outputs?

Yes, you can return multiple outputs from a module.
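A toy sketch of this (the module and attribute names `TwoHead`, `head_a`, `head_b` are illustrative, not from the question): autograd tracks each returned tensor individually, regardless of the Python container they are returned in, so summing the per-output losses and calling `backward()` works as expected.

```python
import torch
import torch.nn as nn

class TwoHead(nn.Module):
    """Toy module that returns two outputs in a plain Python list."""
    def __init__(self):
        super(TwoHead, self).__init__()
        self.head_a = nn.Linear(4, 2)
        self.head_b = nn.Linear(4, 2)

    def forward(self, x):
        # Returning the outputs in a list (or tuple) is fine; autograd
        # follows each tensor's graph independently of the container.
        return [self.head_a(x), self.head_b(x)]

net = TwoHead()
x = torch.randn(3, 4)
outputs = net(x)

# one loss per output, added up into a single scalar
loss = sum(out.pow(2).mean() for out in outputs)
loss.backward()

# gradients have flowed into both heads
print(net.head_a.weight.grad is not None)
```

Note that only the `forward` return value can be a plain list here; layers stored as attributes still need `nn.ModuleList` (or direct attributes) to be registered.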
