Layer Naming in an nn.Module Class

Hello,
I'm new to PyTorch (and to Python in general).

I have a question about how to define the Net class.
Let's say the only layers I use are Linear, ReLU, and Dropout.

The question is whether I should create a separately named attribute for every layer I use, as follows:

import torch.nn as nn

class NetModel(nn.Module):
    def __init__(self):
        super(NetModel, self).__init__()
        # Each layer is a separately named attribute with its own parameters
        self.LinearLayer001 = nn.Linear(10, 10, bias=True)
        self.ReluLayer001 = nn.ReLU()
        self.DropoutLayer001 = nn.Dropout(p=0.1, inplace=False)
        self.LinearLayer002 = nn.Linear(10, 10, bias=True)
        self.ReluLayer002 = nn.ReLU()
        self.DropoutLayer002 = nn.Dropout(p=0.1, inplace=False)
        self.LinearLayer003 = nn.Linear(10, 10, bias=True)

    def forward(self, x):
        x = self.LinearLayer001(x)
        x = self.ReluLayer001(x)
        x = self.DropoutLayer001(x)
        x = self.LinearLayer002(x)
        x = self.ReluLayer002(x)
        x = self.DropoutLayer002(x)
        x = self.LinearLayer003(x)
        return x

Or can I define each layer once and reuse that single instance, as follows:

class NetModel(nn.Module):
    def __init__(self):
        super(NetModel, self).__init__()
        # A single instance of each layer, reused in forward
        self.LinearLayer = nn.Linear(10, 10, bias=True)
        self.ReluLayer = nn.ReLU()
        self.DropoutLayer = nn.Dropout(p=0.1, inplace=False)

    def forward(self, x):
        # The same Linear instance (same weights) is applied three times
        x = self.LinearLayer(x)
        x = self.ReluLayer(x)
        x = self.DropoutLayer(x)
        x = self.LinearLayer(x)
        x = self.ReluLayer(x)
        x = self.DropoutLayer(x)
        x = self.LinearLayer(x)
        return x

The reason I ask is that I tried the second approach, but when I looked at the number of parameters in the net, the count did not grow with the number of layers.
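
This is roughly how I checked, by counting the elements of model.parameters():

model = NetModel()  # the second version above
print(sum(p.numel() for p in model.parameters()))  # 110, no matter how often forward() reuses the layer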

Thank You.


For layers without parameters (like ReLU and Dropout) it doesn't matter. But for layers with parameters (like Linear), the two are very different. The first is a feed-forward net with three independent Linear layers: each nn.Linear(10, 10, bias=True) holds 10*10 weights plus 10 biases, i.e. 110 parameters, so 330 in total. The second applies one and the same 110-parameter Linear layer three times, which makes it a sort of recurrent network. Generally, you want the first. (If you actually want a recurrent network, look at the nn.RNN classes.)
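
You can see the difference by counting parameters. A quick sketch, using nn.Sequential stand-ins for your two classes:

import torch.nn as nn

# First version: three independent Linear layers
feedforward = nn.Sequential(
    nn.Linear(10, 10), nn.ReLU(), nn.Dropout(p=0.1),
    nn.Linear(10, 10), nn.ReLU(), nn.Dropout(p=0.1),
    nn.Linear(10, 10),
)
print(sum(p.numel() for p in feedforward.parameters()))  # 330

# Second version: only one Linear exists, however often forward() calls it
shared = nn.Linear(10, 10)
print(sum(p.numel() for p in shared.parameters()))  # 110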

I see.
So if an instance is defined in __init__ and used several times in the forward pass, it applies the same parameter values at every place it is used.

Whereas for layers that have no parameters, it makes no difference.
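
And when the same layer is called more than once, autograd accumulates the gradients from every call into that one shared weight. A tiny check that convinced me:

import torch
import torch.nn as nn

shared = nn.Linear(10, 10)
x = torch.randn(1, 10)
shared(shared(x)).sum().backward()  # the same layer applied twice
print(shared.weight.grad.shape)     # torch.Size([10, 10]): one gradient, both calls contribute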

That makes sense!

Thank You.