Weight sharing within one weight tensor (nn.Parameter)

Hello all,
I have a weight matrix (for use in a costum model module). In this weight matrix I would like to be able to share weights in a flexible manner (e.g. I provide a list of indices for weights that should be shared). During the training the weights should then receive the same gradient updates and remain the same throughout. Is this possible and if so what approach should I take?
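One possible way to do this (a sketch, not from the original post) is to store only the free parameters in a smaller `nn.Parameter` and build the full weight matrix by indexing it with an integer map. Entries of the map that share the same index then point at the same underlying parameter, so they stay identical by construction, and autograd automatically sums the gradients flowing into repeated indices. The class name `SharedWeightLinear` and the `index_map` convention below are illustrative assumptions:

```python
import torch
import torch.nn as nn

class SharedWeightLinear(nn.Module):
    """Linear layer whose weight matrix is gathered from a smaller
    vector of free parameters via an index map, so tied entries
    always remain equal (illustrative sketch)."""

    def __init__(self, in_features, out_features, index_map):
        super().__init__()
        # index_map: LongTensor of shape (out_features, in_features);
        # entries holding the same index share one free parameter.
        assert index_map.shape == (out_features, in_features)
        self.register_buffer("index_map", index_map)
        n_free = int(index_map.max()) + 1
        self.free_weights = nn.Parameter(torch.randn(n_free) * 0.01)

    @property
    def weight(self):
        # Gather free parameters into the full matrix; autograd
        # accumulates gradients over repeated indices automatically.
        return self.free_weights[self.index_map]

    def forward(self, x):
        return x @ self.weight.t()

# Tie weight[0, 0] and weight[1, 1] by giving them the same index (0):
index_map = torch.tensor([[0, 1],
                          [2, 0]])
layer = SharedWeightLinear(2, 2, index_map)
```

Because the tied entries are literally the same parameter, no extra bookkeeping is needed after optimizer steps; they can never drift apart. An alternative (hooks that copy gradients between separate entries) is more fragile, since the entries only stay equal if every update is mirrored exactly.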