Does the following cause parameter sharing between the conv layers?
self.conv = nn.Conv2d(channels_in, channels_out, kernel_size)
layers = []
for _ in range(num_iters):
    layers.append(self.conv)
self.model = nn.Sequential(*layers)
If so, does the following remedy it by constructing a fresh layer on each iteration?
layers = []
for _ in range(num_iters):
    layers.append(nn.Conv2d(channels_in, channels_out, kernel_size))
self.model = nn.Sequential(*layers)
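A minimal standalone sketch of both versions, which you can use to check the sharing directly (the concrete values for `channels_in`, `channels_out`, `num_iters`, and `kernel_size=3` are arbitrary placeholders):

```python
import torch.nn as nn

channels_in, channels_out, num_iters = 4, 4, 3

# Version 1: appending the same module object repeatedly.
# Every entry in the Sequential is the very same layer, so all
# "copies" share one weight tensor and one bias tensor.
conv = nn.Conv2d(channels_in, channels_out, kernel_size=3)
shared = nn.Sequential(*[conv for _ in range(num_iters)])

# Version 2: constructing a fresh nn.Conv2d each iteration,
# so each entry has its own independent parameters.
independent = nn.Sequential(
    *[nn.Conv2d(channels_in, channels_out, kernel_size=3) for _ in range(num_iters)]
)

print(shared[0].weight is shared[1].weight)            # True: same tensor
print(independent[0].weight is independent[1].weight)  # False: separate tensors

# parameters() deduplicates shared tensors, which makes the
# difference visible in the parameter count as well:
print(len(list(shared.parameters())))       # 2 (one weight + one bias, total)
print(len(list(independent.parameters())))  # 6 (weight + bias per layer)
```

So yes: the first version shares parameters (and gradients accumulate into the same tensors from every position in the stack), and the second version gives each layer its own parameters.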