Partial weight sharing

Hi, I want to set up weight sharing between an nn.Embedding(vocab, d) module and an nn.Linear(2*d, vocab) module, so that the two (vocab, d) halves of the Linear layer's weight matrix (which has shape (vocab, 2*d)) are both tied to the Embedding's weight matrix.

Is it possible to do this? By weight sharing I mean the weights stay tied throughout training, not just at initialization.
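
Concretely, here is a minimal sketch of one way I imagined it could work (the class name, the extra bias parameter, and the approach itself are just my guesses): instead of giving the Linear layer its own weight, the (vocab, 2*d) weight is rebuilt in forward() by concatenating two copies of the embedding weight:

import torch
import torch.nn as nn
import torch.nn.functional as F

class PartiallyTied(nn.Module):
    def __init__(self, vocab, d):
        super().__init__()
        self.embedding = nn.Embedding(vocab, d)        # weight: (vocab, d)
        self.out_bias = nn.Parameter(torch.zeros(vocab))

    def forward(self, x):                              # x: (batch, 2*d)
        # Both halves of the effective weight are the same tensor, so
        # gradients from either half flow back into self.embedding.weight.
        w = torch.cat([self.embedding.weight, self.embedding.weight], dim=1)
        return F.linear(x, w, self.out_bias)           # (batch, vocab)

Would something like this keep the halves tied for the whole of training, or is there a more standard way?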

Many thanks.

P.S.: I would also like to confirm whether doing

self.linear.weight = self.embedding.weight

in the model's __init__ is enough to keep the weights shared for the whole of training, or whether it only makes them equal at initialization?
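
For context, I mean the pattern below (a sketch with matching shapes for simplicity, i.e. an nn.Linear(d, vocab) whose weight is (vocab, d), the same shape as the Embedding's weight):

import torch.nn as nn

class FullyTied(nn.Module):
    def __init__(self, vocab, d):
        super().__init__()
        self.embedding = nn.Embedding(vocab, d)        # weight: (vocab, d)
        self.linear = nn.Linear(d, vocab, bias=False)  # weight: (vocab, d)
        # The line I'm asking about: assigning the Parameter itself,
        # not copying its data.
        self.linear.weight = self.embedding.weight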