[solved] How to tie weights for 2 conv layers

I have 2 conv layers and want their weights to be the same (tied). How should I implement this in PyTorch?

```python
x = self.relu(self.conv1_1(x))
x_bn = self.bn1(x)
y = self.relu(self.conv1_2(y, weight=self.conv1_1.weight))
```

This raises an error (Conv2d's forward does not accept a weight argument). I want the convolution applied to y to use the same weights as conv1_1.
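
If you do want to apply conv1_1's weights to a second input explicitly, the functional API supports passing weights directly. A minimal sketch, assuming conv1_1 is a plain nn.Conv2d:

```python
import torch.nn.functional as F

# F.conv2d takes the weight (and bias) explicitly, so the parameters
# of conv1_1 can be reused on a different input:
y = self.relu(F.conv2d(y, self.conv1_1.weight, self.conv1_1.bias,
                       stride=self.conv1_1.stride,
                       padding=self.conv1_1.padding))
```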

EDIT
I guess reusing the same conv layer for both inputs would work!
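
A minimal sketch of that, reusing conv1_1 from the snippet above:

```python
# Calling the same module on both inputs ties the weights: both
# paths go through the exact same parameter tensors.
x = self.relu(self.conv1_1(x))
x_bn = self.bn1(x)
y = self.relu(self.conv1_1(y))  # y now uses conv1_1's weights too
```

Gradients from both calls then accumulate into the same shared parameters during backprop.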
