Avoid parameter sharing in a for-loop

Does the following cause parameter sharing between the conv layers?

self.conv = nn.Conv2d(channels_in, channels_out, kernel_size=3)  # kernel_size is required

layers = []
for _ in range(num_iters):
    layers.append(self.conv)

self.model = nn.Sequential(*layers)

If so, does the following remedy it?

layers = []
for _ in range(num_iters):
    layers.append(nn.Conv2d(channels_in, channels_out, kernel_size=3))  # kernel_size is required

self.model = nn.Sequential(*layers)

In the first snippet you append the same conv layer num_iters times, so yes, its parameters are shared (or rather reused) at every position in the Sequential.
The second snippet creates num_iters new conv layers, each with its own parameters, so there is no sharing.
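You can verify this directly by comparing the weight tensors and the parameter counts. A minimal sketch, assuming channels_in = channels_out = 8 and kernel_size=3 (hypothetical values chosen so the stacked convs are shape-compatible):

import torch.nn as nn

channels_in = channels_out = 8  # hypothetical values for illustration
num_iters = 3

# Shared: the same module object is appended repeatedly
conv = nn.Conv2d(channels_in, channels_out, kernel_size=3)
shared = nn.Sequential(*[conv for _ in range(num_iters)])

# Separate: a new module (with new parameters) per iteration
separate = nn.Sequential(*[nn.Conv2d(channels_in, channels_out, kernel_size=3)
                           for _ in range(num_iters)])

print(shared[0].weight is shared[1].weight)      # True  -> parameters are shared
print(separate[0].weight is separate[1].weight)  # False -> independent parameters

# Parameter counts confirm it: .parameters() deduplicates repeated modules,
# so the shared model reports only one conv's worth of parameters
print(sum(p.numel() for p in shared.parameters()))    # one conv layer
print(sum(p.numel() for p in separate.parameters()))  # num_iters x that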


Great, thank you so much!