Sharing weights

Assume I have a linear layer in PyTorch. For the weight matrix of this layer, can I define only one learnable parameter and then expand this parameter to fit the required matrix size of the layer? So far, I think this is impossible, so I'd appreciate any tips. I want to reduce the number of learnable parameters for this layer to one. It doesn't matter if, in the end, all entries of the matrix have the same value.
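
To make it concrete, something along these lines is roughly what I have in mind (the class name `SharedWeightLinear` is just a placeholder of mine, and I'm not sure whether `expand()` is the right mechanism or whether gradients flow back correctly):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedWeightLinear(nn.Module):
    """Linear layer whose whole weight matrix comes from a single scalar parameter."""

    def __init__(self, in_features, out_features):
        super().__init__()
        # one learnable scalar shared by every entry of the weight matrix
        self.shared_weight = nn.Parameter(torch.randn(1, 1))
        self.bias = nn.Parameter(torch.zeros(out_features))
        self.in_features = in_features
        self.out_features = out_features

    def forward(self, x):
        # expand the single parameter to the full (out_features, in_features) shape;
        # expand() creates a view, so gradients should accumulate back into the scalar
        weight = self.shared_weight.expand(self.out_features, self.in_features)
        return F.linear(x, weight, self.bias)
```

Is this the intended way to do it, or is there a cleaner approach?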