Does calling a layer multiple times use the same weights?

Hi, I am not sure: if we call a layer defined in __init__ multiple times, does it share weights during training? For example, we have a layer fc1 defined as:

import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.fc1 = nn.Linear(10, 10)
        ...

Then I call it multiple times in forward:

def forward(self, x1, x2):
    x1 = self.fc1(x1)
    x2 = self.fc1(x2)
    ...

I wonder whether these two calls maintain the same weights during training. If not, how do we make them share weights? Thanks a lot!


Yes. Since you call the same module, the same underlying parameters (weight and bias) are used for both computations, and during training the gradients from both calls accumulate into those shared parameters.
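To make this concrete, here is a minimal, self-contained sketch (reusing the Net from the question) showing that fc1 owns a single set of parameters and that gradients from both calls flow into the same tensor:

import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.fc1 = nn.Linear(10, 10)

    def forward(self, x1, x2):
        # Both calls go through the same nn.Linear instance.
        return self.fc1(x1), self.fc1(x2)

net = Net()

# Only one set of parameters exists, however often fc1 is called:
# 10*10 weights + 10 biases = 110.
print(sum(p.numel() for p in net.parameters()))  # 110

# Gradients from both calls accumulate into the same weight tensor.
out1, out2 = net(torch.randn(4, 10), torch.randn(4, 10))
(out1.sum() + out2.sum()).backward()
print(net.fc1.weight.grad.shape)  # torch.Size([10, 10])

So there is nothing extra to do: reusing the attribute is exactly how you share weights.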


A similar question, if you want to read more: How to create model with sharing weight?