Sharing a specific weight between two instances of a custom layer

I have defined my own layer that contains two sets of weights (w1, w2). Now I want to use this layer to build a model. The model contains two instances of my layer, but I need one of the weight sets to be shared between both instances. How can I do that? My code is as follows (the dimension values are just examples):

import torch
import torch.nn as nn
from torch.nn import Parameter

F1, F2, F3 = 16, 32, 64  # example feature dimensions


class Layer(nn.Module):
    def __init__(self):
        super(Layer, self).__init__()
        # Each instance creates its own two weight matrices
        self.weight1 = Parameter(torch.FloatTensor(F1, F3))
        self.weight2 = Parameter(torch.FloatTensor(F2, F3))


class Model(nn.Module):
    def __init__(self):
        super(Model, self).__init__()
        self.l1 = Layer()
        self.l2 = Layer()  # l2.weight2 is a separate tensor from l1.weight2

I want both l1 and l2 to share the same weight2, but I don't know how to do it.
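
One idea I had is to create the shared Parameter once and pass it into the second layer instance, something like the sketch below (the names SharedLayer and shared_weight2 are just for illustration), but I'm not sure this is the proper way:

class SharedLayer(nn.Module):
    def __init__(self, shared_weight2=None):
        super(SharedLayer, self).__init__()
        self.weight1 = Parameter(torch.FloatTensor(F1, F3))
        if shared_weight2 is None:
            # No shared tensor given: create this instance's own weight2
            self.weight2 = Parameter(torch.FloatTensor(F2, F3))
        else:
            # Assigning an existing Parameter registers the same tensor
            # object, so both layers read and update the same weights
            self.weight2 = shared_weight2


class SharedModel(nn.Module):
    def __init__(self):
        super(SharedModel, self).__init__()
        self.l1 = SharedLayer()
        # Pass l1's weight2 so l2 reuses it instead of creating its own
        self.l2 = SharedLayer(shared_weight2=self.l1.weight2)

If I understand correctly, after model = SharedModel() the check model.l1.weight2 is model.l2.weight2 should be True, and model.parameters() should yield the shared tensor only once, so an optimizer would update it a single time per step. Is that the right approach, or is there a better pattern?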
Thanks