How to share weights between two models?

I want to share weights between two models. As an illustration, I created two models, A and B, and I want B to be part of A, such that A = B + 2 fully connected layers. Can I just instantiate the B model class inside the A model class? And will the B part of A have its weights updated when I train B separately (i.e., not in the same graph as A)? Thanks in advance.

Yep, you can.

Yep, it will.

If I have correctly understood what you are proposing to do, then yes, PyTorch will handle all the complexities with its usual grace.
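To make the idea concrete, here is a minimal sketch of one way to do it. The class names `A` and `B` and the layer sizes are just illustrative; the key point is that `A` holds a reference to the *same* `B` module object, so both models see the same parameter tensors:

```python
import torch
import torch.nn as nn

class B(nn.Module):
    """The shared sub-model."""
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(10, 10)

    def forward(self, x):
        return torch.relu(self.fc(x))

class A(nn.Module):
    """A = B + 2 fully connected layers."""
    def __init__(self, b: nn.Module):
        super().__init__()
        self.b = b                    # same module object -> shared weights
        self.fc1 = nn.Linear(10, 10)  # the two extra FC layers
        self.fc2 = nn.Linear(10, 2)

    def forward(self, x):
        x = self.b(x)
        x = torch.relu(self.fc1(x))
        return self.fc2(x)

b = B()
a = A(b)

# The B part of A and the standalone B share the exact same tensors,
# so training b separately also updates a.b:
assert a.b.fc.weight is b.fc.weight
```

Because `a.b` and `b` are the same `nn.Module` instance (not a copy), any optimizer step taken on `b`'s parameters is immediately visible through `a`, and vice versa. Just be careful not to pass the same parameters to two optimizers at once unless that is what you intend.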

Thanks a lot! Could you help me on this issue as well?