Total number of parameters with shared weights

Hi,

I'm trying to use shared weights and have the correct number of parameters displayed.
A simple network of two linear layers that share the same weight matrix:

import torch
import torch.nn as nn

class testModule(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(10, 10, bias=True)
        self.fc2 = nn.Linear(10, 10, bias=False)

        # Remove the weights as we override them in the forward
        # so that they don't show up when calling .parameters()
        del self.fc1.weight
        del self.fc2.weight

        self.fc2_base_weights = nn.Parameter(torch.randn(10, 10))
        # self.shared_weights = nn.Parameter(torch.randn(10, 5))

        self.fc1.weight = self.fc2_base_weights
        self.fc2.weight = self.fc2_base_weights

Yet I get 210 parameters instead of 110…

----------------------------------------------------------------
        Layer (type)               Output Shape         Param #
================================================================
            Linear-1               [-1, 10, 10]             110
            Linear-2               [-1, 10, 10]             100
================================================================
Total params: 210
Trainable params: 210
Non-trainable params: 0
----------------------------------------------------------------
Input size (MB): 0.00
Forward/backward pass size (MB): 0.00
Params size (MB): 0.00
Estimated Total Size (MB): 0.00
----------------------------------------------------------------
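As a side note, `nn.Module.parameters()` deduplicates shared parameters by default, so counting them directly already gives the expected total. A minimal standalone sketch (the module above is reproduced so it runs on its own):

```python
import torch
import torch.nn as nn

class testModule(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(10, 10, bias=True)
        self.fc2 = nn.Linear(10, 10, bias=False)

        # Remove the layers' own weights so only the shared one remains
        del self.fc1.weight
        del self.fc2.weight

        self.fc2_base_weights = nn.Parameter(torch.randn(10, 10))
        self.fc1.weight = self.fc2_base_weights
        self.fc2.weight = self.fc2_base_weights

model = testModule()
# parameters() skips duplicate tensors by default (remove_duplicate=True),
# so the shared weight is counted once: 100 (weight) + 10 (fc1 bias) = 110
print(sum(p.numel() for p in model.parameters()))  # 110
```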

Thank you in advance!

I get the expected 110 total parameters using the latest version of torchinfo:

import torch
import torch.nn as nn
from torchinfo import summary

class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(10, 10, bias=True)
        self.fc2 = nn.Linear(10, 10, bias=False)

        # Remove the weights as we override them in the forward
        # so that they don't show up when calling .parameters()
        del self.fc1.weight
        del self.fc2.weight

        self.fc2_base_weights = nn.Parameter(torch.randn(10, 10))
        # self.shared_weights = nn.Parameter(torch.randn(10, 5))

        self.fc1.weight = self.fc2_base_weights
        self.fc2.weight = self.fc2_base_weights
        
model = MyModel()
summary(model)
# =================================================================
# Layer (type:depth-idx)                   Param #
# =================================================================
# MyModel                                  --
# ├─Linear: 1-1                            110
# ├─Linear: 1-2                            100
# =================================================================
# Total params: 110
# Trainable params: 110
# Non-trainable params: 0
# =================================================================
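For a quick sanity check that the two layers really reference one and the same `Parameter`, and that gradients from both uses accumulate into it, you can run something like the sketch below (the `forward` here is a hypothetical addition just to drive `backward()`; the original post never shows one):

```python
import torch
import torch.nn as nn

class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(10, 10, bias=True)
        self.fc2 = nn.Linear(10, 10, bias=False)

        del self.fc1.weight
        del self.fc2.weight

        self.fc2_base_weights = nn.Parameter(torch.randn(10, 10))
        self.fc1.weight = self.fc2_base_weights
        self.fc2.weight = self.fc2_base_weights

    def forward(self, x):
        # hypothetical forward: both layers use the shared weight
        return self.fc2(self.fc1(x))

model = MyModel()
# both layers hold the very same Parameter object
print(model.fc1.weight is model.fc2.weight)  # True

# gradients from both uses accumulate into the single shared tensor
model(torch.randn(3, 10)).sum().backward()
print(model.fc2_base_weights.grad.shape)  # torch.Size([10, 10])
```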