What happens if I reuse nn.Linear?

Hello,

What would happen if I reuse an nn.Module instance? Will the module share the same weights across both uses?

A simple model:

import torch.nn as nn

class SimpleModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(1, 1)

    def forward(self, x):
        output = self.linear(x)
        output = self.linear(output)  # re-using self.linear
        return output

Thanks


What do you mean by reusing an nn.Module and sharing weights?
What you have done above is reuse the nn.Module class through simple Python inheritance. You have inherited from nn.Module and created a derived class that contains all of the methods of nn.Module as well as any new methods you define.
Sharing weights is something different and depends on the architecture you are building.

What I mean is that I instantiated nn.Linear and assigned it to self.linear. In the forward method, I use self.linear twice. Does that mean I repeated two identical layers here? Sorry for the confusion, I just realized the title is misleading…

Yes, you have used the same layer twice; in fact, both calls go through the same nn.Linear instance. They will have the same weights because at the end of the day they are the same object. During the backward pass, the gradients from both calls accumulate into that single set of weights.
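A quick way to check this, using the model from the question above: the module registers only one weight and one bias, no matter how many times self.linear is called in forward.

```python
import torch.nn as nn

class SimpleModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(1, 1)

    def forward(self, x):
        output = self.linear(x)
        output = self.linear(output)  # same layer, same weights
        return output

model = SimpleModel()
# Only two parameter tensors exist (one weight, one bias),
# even though the layer is applied twice in forward.
print(len(list(model.parameters())))  # 2
```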
However:

self.linear = nn.Linear(1, 1)
self.linear1 = nn.Linear(1, 1)
...

def forward(self, x):
    output = self.linear(x)
    output = self.linear1(output)  # a second, separate layer
    return output

linear1 and linear will have different weights, even though both were created with nn.Linear(1, 1).
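A small sketch to confirm the difference: two separate nn.Linear instances hold completely independent parameter tensors.

```python
import torch.nn as nn

linear = nn.Linear(1, 1)
linear1 = nn.Linear(1, 1)

# Each instance owns its own weight and bias: four tensors in total.
total = len(list(linear.parameters())) + len(list(linear1.parameters()))
print(total)  # 4

# The weight tensors are distinct objects, initialized independently.
print(linear.weight is linear1.weight)  # False
```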
