Pytorch: a shared conv layer

I have two different networks. The parameters of one Conv layer should be shared everywhere it appears. What should I do? How can I use the parameters of this layer from one network in the other network?

You can look at this thread, or you can tie the layers together by assigning the parameter from one model to the other, e.g. `model.conv1.weight = model2.conv1.weight`. Note that this assigns the same `nn.Parameter` object to both modules, so they stay in sync automatically; you do not need to re-copy the weights after every update. A cleaner alternative is to create the conv layer once and pass the same module instance to both networks.
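Here is a minimal sketch of the module-sharing approach. The network classes (`NetA`, `NetB`), layer sizes, and the assumed input shape `(N, 1, 28, 28)` are illustrative, not from the original post:

```python
import torch
import torch.nn as nn

class NetA(nn.Module):
    def __init__(self, shared_conv):
        super().__init__()
        self.conv = shared_conv          # shared layer, not a copy
        self.fc = nn.Linear(8 * 28 * 28, 10)

    def forward(self, x):
        x = torch.relu(self.conv(x))
        return self.fc(x.flatten(1))

class NetB(nn.Module):
    def __init__(self, shared_conv):
        super().__init__()
        self.conv = shared_conv          # the very same module object as in NetA
        self.fc = nn.Linear(8 * 28 * 28, 5)

    def forward(self, x):
        x = torch.relu(self.conv(x))
        return self.fc(x.flatten(1))

# Build the conv layer once and hand the same instance to both networks.
shared = nn.Conv2d(1, 8, kernel_size=3, padding=1)
net_a = NetA(shared)
net_b = NetB(shared)

# Both networks hold the identical parameter tensors, so any gradient
# update through either network changes the weights seen by both.
print(net_a.conv.weight is net_b.conv.weight)  # True
```

Because the parameters are one and the same object, an optimizer over either network's `parameters()` will update the shared conv layer for both.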