Save/load model with shared embeddings weights

I have a model that uses several different embedding layers, and these layers are grouped by shared weights. E.g. I have 5 embedding layers, A, B, C, D, E, of which A and B share one weight matrix and C, D, E share another weight matrix.

I create those embedding layers using the _weight parameter, passing in my own initial weights (which are stored in one of my own class instances).
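To make the setup concrete, here is roughly what I do; WeightStore, my_store, weights_ab and weights_cde are just illustrative stand-ins for my own class and its numpy arrays:

```python
import numpy as np
import torch
import torch.nn as nn

# Stand-in for my own object that holds the initial weights (illustrative only).
class WeightStore:
    def __init__(self):
        self.weights_ab = np.random.rand(1000, 64).astype(np.float32)
        self.weights_cde = np.random.rand(500, 64).astype(np.float32)

my_store = WeightStore()

# One tensor per group: A and B are built from the first, C, D, E from the second.
w_ab = torch.from_numpy(my_store.weights_ab)
w_cde = torch.from_numpy(my_store.weights_cde)

emb_a = nn.Embedding(w_ab.size(0), w_ab.size(1), _weight=w_ab)
emb_b = nn.Embedding(w_ab.size(0), w_ab.size(1), _weight=w_ab)
emb_c = nn.Embedding(w_cde.size(0), w_cde.size(1), _weight=w_cde)
emb_d = nn.Embedding(w_cde.size(0), w_cde.size(1), _weight=w_cde)
emb_e = nn.Embedding(w_cde.size(0), w_cde.size(1), _weight=w_cde)
```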

Now please help me understand what happens when I save and later load the full module that contains all these embedding layers, using torch.save(module, filename) and then torch.load(filename): will the weights still be stored and loaded only once for A, B and once for C, D, E, and remain properly shared after loading?
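Continuing the snippet above, this is the round trip I have in mind (MyModule and the attribute names are just for illustration); what I would like to know is whether a check like this still holds after loading:

```python
# Wrap the embedding layers in one module so the whole thing can be saved at once.
class MyModule(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb_a, self.emb_b = emb_a, emb_b
        self.emb_c, self.emb_d, self.emb_e = emb_c, emb_d, emb_e

module = MyModule()
torch.save(module, "model.pt")

restored = torch.load("model.pt")
# Are the weights still stored/loaded only once per group and shared?
print(restored.emb_a.weight.data_ptr() == restored.emb_b.weight.data_ptr())
print(restored.emb_c.weight.data_ptr() == restored.emb_d.weight.data_ptr())
```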

And what happens if the parameters of the embedding layers with shared weights get transferred to the GPU and then back to the CPU: is the sharing still preserved?
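Again, just to illustrate what I mean by "transferred and back" (continuing the same snippet):

```python
if torch.cuda.is_available():
    module.cuda()  # move all parameters to the GPU
    module.cpu()   # and back to the CPU
    # Is the weight behind A and B still a single shared tensor after the round trip?
    print(module.emb_a.weight.data_ptr() == module.emb_b.weight.data_ptr())
```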

Finally, since the numpy array for the weights comes from my own object, which side is responsible for loading the initial copy that then gets shared? In other words, if I pickle my own object independently, is there still a way to get my restored weights shared between the embedding layers?
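For example, if I restore my own object as sketched below (continuing the snippet above, names still illustrative), do I have to push the weights back into the layers and re-tie them by hand, or is there a cleaner way?

```python
import pickle

# Save my own object independently of the module.
with open("store.pkl", "wb") as f:
    pickle.dump(my_store, f)

# Later, in a fresh process: restore my object ...
with open("store.pkl", "rb") as f:
    my_store = pickle.load(f)

# ... and manually re-assign and re-tie the embedding weights?
restored_ab = torch.from_numpy(my_store.weights_ab)
module.emb_a.weight = nn.Parameter(restored_ab)
module.emb_b.weight = module.emb_a.weight  # re-tie so A and B share again
```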