Weight sharing: two model objects using a reference to the same parameters

Hi,

I would like to know if it’s possible to share a reference to the same parameters between two models.

This is not traditional weight sharing; the purpose is to work around a limitation of torch.fx when the model has different behaviours depending on its training mode.

Basically, I am writing a framework for automatic model surgery. However, when tracing a model, the resulting graph only captures the mode the model was traced in, and therefore loses the original model's behaviour in the other mode.
So the idea was to trace and perform the surgery in both modes, and then wrap the two results into a module that calls the appropriate one depending on its training mode.
The problem is that I need the two modules to share the same weights.
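For context, the wrapper I have in mind looks roughly like this (just a sketch; using symbolic_trace here as one way to obtain the two graphs):

```python
import torch
from torch import nn
from torch.fx import symbolic_trace


class TrainEvalWrapper(nn.Module):
    """Dispatch to a train-mode or eval-mode traced graph based on self.training."""

    def __init__(self, model: nn.Module):
        super().__init__()
        model.train()
        self.train_graph = symbolic_trace(model)  # graph captured in train mode
        model.eval()
        self.eval_graph = symbolic_trace(model)   # graph captured in eval mode

    def forward(self, *args, **kwargs):
        # Pick the graph matching the wrapper's current mode.
        if self.training:
            return self.train_graph(*args, **kwargs)
        return self.eval_graph(*args, **kwargs)
```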

One solution would be to copy the weights from the training version into the evaluation one every time the mode is switched to evaluation.
However, it would be a lot cleaner if I could just have the two models hold references to the same weights.
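Concretely, this is what I am hoping for (toy example with two nn.Linear modules standing in for the traced train/eval variants):

```python
import torch
from torch import nn

# Two hypothetical surgically-modified variants of the same model.
train_model = nn.Linear(10, 5)
eval_model = nn.Linear(10, 5)

# Point the eval variant's attributes at the exact same nn.Parameter
# objects held by the train variant, so both see the same storage.
eval_model.weight = train_model.weight
eval_model.bias = train_model.bias

# An in-place update through one module is visible through the other.
with torch.no_grad():
    train_model.weight.add_(1.0)
assert eval_model.weight.data_ptr() == train_model.weight.data_ptr()
```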

Is it even possible?

Cheers