Doubt about weight-sharing between two layers

I am a beginner and I wanted to know how I can share weights between two conv layers in a model. I read some posts about this on the forum but couldn't understand them.

I found this example in the PyTorch tutorials:

https://pytorch.org/tutorials/beginner/examples_nn/dynamic_net.html

Here we can see that the middle_linear layer is being reused in the forward() method.

for _ in range(random.randint(0, 3)):
    h_relu = self.middle_linear(h_relu).clamp(min=0)
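
For context, the module in that tutorial looks roughly like this (paraphrased from memory, so the exact layer sizes may differ from the tutorial page):

import random
import torch.nn as nn

class DynamicNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.input_linear = nn.Linear(1000, 100)
        # middle_linear is instantiated once...
        self.middle_linear = nn.Linear(100, 100)
        self.output_linear = nn.Linear(100, 10)

    def forward(self, x):
        h_relu = self.input_linear(x).clamp(min=0)
        # ...and then called a random number of times in forward()
        for _ in range(random.randint(0, 3)):
            h_relu = self.middle_linear(h_relu).clamp(min=0)
        return self.output_linear(h_relu)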

So does this mean that after backprop, the weights of all the "copies" of this layer (middle_linear) will have the same values?

If yes, is it true that if I want two layers to share weights, I can just instantiate the layer once, and no matter how many times I reuse it, the weights will be the same? For example:

import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv_start = nn.Conv2d(3, 64, kernel_size=3, padding=1)
        self.conv_1 = nn.Conv2d(64, 64, kernel_size=3, padding=1)

    def forward(self, input):
        out = self.conv_start(input)
        # the same conv_1 module is applied three times
        out = self.conv_1(out)
        out = self.conv_1(out)
        out = self.conv_1(out)
        return out

After backprop, are the weights for the 3 "copies" of the conv_1 layer updated differently? Or is it just one layer being used multiple times in the dynamic graph, and therefore sharing its weights?
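
To make my question concrete, here is the kind of check I had in mind (a small, untested sketch using the Net class above; the input size is just a placeholder):

import torch

net = Net()

# conv_1 is stored once in the module, so its parameters show up only once here
print(sum(p.numel() for p in net.parameters()))

out = net(torch.randn(1, 3, 8, 8))
out.sum().backward()

# there is a single weight tensor for conv_1; gradients from all three
# applications in forward() are accumulated into this one .grad
print(net.conv_1.weight.grad.shape)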

Thank you.

I think this was just:

To showcase the power of PyTorch dynamic graphs

In practice, this makes little sense to me. The weights are updated by the optimizer, and they are typically different at each step.

@dejanbatanjac

No no, I know that the weights will be different at each step. I want to know whether the three "copies" of conv_1 share the same weights.