The question has already been answered several times, but I want to share my experience, which may help you think through a practical case.
Firstly, I want to mention again that nn.Sequential stores layers and already implements a forward method in which the layers are applied in a cascaded way. The point is, you don't always want layers to be cascaded. In my case, what I needed was to concatenate the outputs of CNN layers with different kernel sizes.
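To make the contrast concrete, here is a minimal sketch (the layer sizes are made up for illustration): nn.Sequential chains layers one after another, so every layer consumes the previous layer's output, and parallel branches whose outputs get concatenated cannot be expressed this way.

```python
import torch
import torch.nn as nn

# nn.Sequential cascades: each layer's output feeds the next layer.
cascade = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 4))

x = torch.randn(2, 8)
out = cascade(x)
print(out.shape)  # torch.Size([2, 4])
```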
Here is the paper I tried to implement: “Convolutional Neural Networks for Sentence Classification”. As a starting point, I found an implementation on GitHub:
class CNNSentence(nn.Module):
    def __init__(self, args, data, vectors):
        super(CNNSentence, self).__init__()
        ...
        for filter_size in args.FILTER_SIZES:
            conv = nn.Conv1d(self.in_channels,
                             args.num_feature_maps,
                             args.word_dim * filter_size,
                             stride=args.word_dim)
            setattr(self, 'conv_' + str(filter_size), conv)
        ...

    def forward(self, batch):
        ...
        conv_result = [
            F.max_pool1d(F.relu(getattr(self, 'conv_' + str(filter_size))(conv_in)),
                         seq_len - filter_size + 1).view(-1, self.args.num_feature_maps)
            for filter_size in self.args.FILTER_SIZES]
        out = torch.cat(conv_result, 1)
        ...
However, I skipped setting the convolutional layers as attributes while rewriting the model. Later, while transferring the network to the GPU, I got an error and realized that the convolutional layers were not part of my network. My failed code is below:
class CNN_Sentence(nn.Module):
    def __init__(self, ..., ngram_filter_sizes=[3, 4, 5], ...):
        super(CNN_Sentence, self).__init__()
        ...
        self.convs = []  # plain Python list: the convs are never registered as submodules
        for ngram_filter in ngram_filter_sizes:
            conv = nn.Conv1d(embedding_size,
                             conv_out_filter,
                             ngram_filter).to(args.device)
            self.convs.append(conv)
        ...

    def forward(self, batch):
        ...
        x = []
        for conv in self.convs:
            conv_out = conv(batch)
            max_pool_kernel = conv_out.shape[2]
            conv_out = F.max_pool1d(F.relu(conv_out),
                                    max_pool_kernel)
            x.append(conv_out.view(batch_size, -1))
        ...
I think it is a good example of why we need nn.ModuleList and how it differs from nn.Sequential.
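You can verify the registration difference directly: submodules stored in a plain Python list are invisible to nn.Module, while the same layers in an nn.ModuleList show up in .parameters(). A minimal demonstration (with made-up layer sizes):

```python
import torch.nn as nn

class WithPlainList(nn.Module):
    def __init__(self):
        super().__init__()
        self.convs = [nn.Conv1d(8, 4, 3)]  # plain list: NOT registered

class WithModuleList(nn.Module):
    def __init__(self):
        super().__init__()
        self.convs = nn.ModuleList([nn.Conv1d(8, 4, 3)])  # registered

print(len(list(WithPlainList().parameters())))   # 0 -> nothing to train or move to GPU
print(len(list(WithModuleList().parameters())))  # 2 -> conv weight and bias
```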
You can also find the discussion that introduced nn.ModuleList, which helped me discover the class:
https://discuss.pytorch.org/t/list-of-nn-module-in-a-nn-module/219