Shared embedding in PyTorch

Dear friends,

How can I share the same embeddings (words or characters) between the encoder and decoder?

Just define the embedding first and pass it to the encoder and decoder as an initialization parameter, like I did here, for example. The important lines are:

# Create the embedding layer once...
self.embedding = nn.Embedding(self.params.vocab_size, self.params.embed_dim)
...
# ...and hand the same instance to both modules
self.encoder = Encoder(device, params, self.embedding)
self.decoder = Decoder(device, params, self.embedding, self.criterion)
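
For completeness, here is a minimal, self-contained sketch of the same idea. The Encoder and Decoder classes below are hypothetical stand-ins (the actual modules aren't shown in the snippet above), and the GRU layers and dimensions are just illustrative choices. The key point is that both modules store a reference to the same nn.Embedding instance, so gradients from both the encoder and the decoder update one set of embedding weights:

import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, embedding, hidden_dim):
        super().__init__()
        self.embedding = embedding  # shared module, not a copy
        self.rnn = nn.GRU(embedding.embedding_dim, hidden_dim, batch_first=True)

    def forward(self, src):
        # src: (batch, src_len) token ids -> encoder outputs and final hidden state
        return self.rnn(self.embedding(src))

class Decoder(nn.Module):
    def __init__(self, embedding, hidden_dim):
        super().__init__()
        self.embedding = embedding  # the same object the encoder holds
        self.rnn = nn.GRU(embedding.embedding_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, embedding.num_embeddings)

    def forward(self, tgt, hidden):
        # tgt: (batch, tgt_len) token ids; hidden: encoder's final state
        output, hidden = self.rnn(self.embedding(tgt), hidden)
        return self.out(output), hidden

vocab_size, embed_dim, hidden_dim = 1000, 64, 128
shared_embedding = nn.Embedding(vocab_size, embed_dim)
encoder = Encoder(shared_embedding, hidden_dim)
decoder = Decoder(shared_embedding, hidden_dim)

# Both modules point at the very same parameter tensor:
assert encoder.embedding.weight is decoder.embedding.weight

One thing to watch for: since both modules reference the same parameter, collect parameters for the optimizer through a single parent module (like the wrapper class in the snippet above) so the shared embedding weight isn't passed to the optimizer twice.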

Thank you, Mr. @vdw 🙂

Mr vdw is my dad. I’m just Chris :). Happy coding!


Oh, I’m sorry, Mr. Chris, and thank you again for your support 🙂