How to share weights between two modules?

In my architecture, I need to reuse a module with the same weights as the original. Is the following method the right way to achieve this?

class Encoder(nn.Module):
    def __init__(self):
        super(Encoder, self).__init__()

    def forward(self, x1, x2, emb, input_sel):
        # select which input to encode
        if input_sel == 0:
            x = x1
        else:
            x = x2

        x = ...

class Top(nn.Module):
    def __init__(self):
        super(Top, self).__init__()
        self.encoder = Encoder()
        self.decoder = Decoder()

    def forward(self, input, emb1, emb2):
        encoder_output = self.encoder(input, input, emb1, 0)
        decoder_output = self.decoder(encoder_output)
        code_output = self.encoder(decoder_output, decoder_output, emb1, 1)
        return encoder_output, decoder_output, code_output

Do the two encoder calls share the same weights with the method above?

Hi nkcdy,
Yes, the code you've shared is one valid way to share weights, and it should work for your purpose. Since `self.encoder` is a single module instance, both calls in `Top.forward` run through the same parameters, and during backpropagation the gradients from both calls accumulate into that one set of weights.
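Here is a minimal, self-contained sketch of the same pattern you can run to convince yourself. The `Encoder` here is a stand-in (a single `nn.Linear`, not your real architecture), and `Top` calls it twice, once on the input and once on the decoder output, just like in your code:

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    # stand-in encoder: one linear layer
    def __init__(self, dim):
        super(Encoder, self).__init__()
        self.fc = nn.Linear(dim, dim)

    def forward(self, x):
        return torch.relu(self.fc(x))

class Top(nn.Module):
    def __init__(self, dim):
        super(Top, self).__init__()
        self.encoder = Encoder(dim)       # one instance -> one set of weights
        self.decoder = nn.Linear(dim, dim)  # stand-in decoder

    def forward(self, x):
        enc = self.encoder(x)             # first pass through the encoder
        dec = self.decoder(enc)
        code = self.encoder(dec)          # second pass reuses the SAME weights
        return enc, dec, code

model = Top(4)
# Only one encoder's worth of parameters exists:
# encoder.fc (weight, bias) + decoder (weight, bias) = 4 tensors
num_param_tensors = len(list(model.parameters()))
```

If instead you created two separate `Encoder()` instances, each would get its own independent parameters; sharing comes purely from calling the same instance twice.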