Module List Activation Function

Hello,

I am trying to write a model class that lets me build a neural network with a variable number of layers and layer sizes. As such, I am using an nn.ModuleList.
However, when I was using “nn.Linear” layers directly, before switching to a ModuleList, I would specify a Sigmoid between the layers and a softmax at the end. The softmax I can just apply at the end, but how do I put a sigmoid layer between each module within the ModuleList?

Thank you so much!

Hi James!

I’m not entirely sure what you’re asking or how you intend to use your ModuleList, but note that a torch.nn.ReLU (and likewise a torch.nn.Sigmoid) is a Module, so you can include it in your ModuleList, in between, for example, some torch.nn.Linears.
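
For example, something along these lines (just a sketch with made-up class name and layer sizes, using a Sigmoid between layers and a Softmax at the end, as in your description):

import torch
import torch.nn as nn

class VariableNet(nn.Module):
    def __init__(self, layer_sizes):
        super().__init__()
        self.layers = nn.ModuleList()
        pairs = list(zip(layer_sizes[:-1], layer_sizes[1:]))
        for i, (n_in, n_out) in enumerate(pairs):
            self.layers.append(nn.Linear(n_in, n_out))
            # Sigmoid between layers, Softmax after the last one
            if i < len(pairs) - 1:
                self.layers.append(nn.Sigmoid())
            else:
                self.layers.append(nn.Softmax(dim=1))

    def forward(self, x):
        # a ModuleList is not callable itself, so iterate over its modules
        for layer in self.layers:
            x = layer(x)
        return x

net = VariableNet([10, 20, 20, 3])
out = net(torch.randn(5, 10))   # shape [5, 3], each row sums to 1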

Best.

K. Frank


That makes sense, thank you Frank! I was originally trying to create both the ReLU and the linear layers when initializing the ModuleList, but found this method more helpful, in case anyone is curious:

self.fc2 = nn.ModuleList()

# append one Linear + ReLU pair per entry along the first dimension of
# initHiddenWeights, with shape[1] and shape[2] as the in/out feature sizes
for _ in range(initHiddenWeights.shape[0]):
    self.fc2.append(nn.Linear(initHiddenWeights.shape[1], initHiddenWeights.shape[2]))
    self.fc2.append(nn.ReLU())
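
If it helps anyone, a matching forward pass might look roughly like this (a sketch only; it assumes the self.fc2 from the snippet above, an `import torch`, and a softmax at the end as mentioned in the original question):

def forward(self, x):
    # run the input through each Linear / ReLU pair in order
    for layer in self.fc2:
        x = layer(x)
    # softmax over the class dimension at the end
    return torch.softmax(x, dim=1)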