Using the same layer twice does not give different weights

import torch
import torch.nn as nn

class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(2, 2)

    def forward(self, x):
        # fc1 is applied twice in sequence
        x = self.fc1(x)
        x = self.fc1(x)
        return x

Here is my model. As you can see, I have defined fc1 just once and I am using it twice sequentially in my forward function.
I expected to get separate weights for the two layers in my forward function, but I am only getting weights for one layer. Please help!

model = Model()
for name, param in model.named_parameters():
    print(name, param, param.shape)

Output:

fc1.weight Parameter containing:
tensor([[ 0.0108, -0.2179],
        [ 0.5695, -0.1553]], requires_grad=True) torch.Size([2, 2])
fc1.bias Parameter containing:
tensor([ 0.4658, -0.6482], requires_grad=True) torch.Size([2])

I don’t see why this is a problem.
Please have a look at the documentation if you haven’t already done so.

Why would you get two different weights? In your code snippet you are reusing the same layer, which holds exactly one set of parameters. Calling self.fc1 twice in forward applies those same weights twice (i.e. weight sharing); it does not create a second layer.
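
If you actually want two independent linear layers rather than shared weights, a minimal sketch would define a second module. (The names TwoLayerModel and fc2 below are just illustrative and are not from your original code.)

import torch
import torch.nn as nn

class TwoLayerModel(nn.Module):
    def __init__(self):
        super().__init__()
        # Two separate modules -> two independent sets of weights
        self.fc1 = nn.Linear(2, 2)
        self.fc2 = nn.Linear(2, 2)

    def forward(self, x):
        x = self.fc1(x)
        x = self.fc2(x)
        return x

model = TwoLayerModel()
for name, param in model.named_parameters():
    # Now fc1.weight, fc1.bias, fc2.weight, and fc2.bias are all listed
    print(name, param.shape)

If applying the same transformation twice (weight sharing) is what you intend, then your original code is already doing the right thing.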