Confusion about nested modules and shared parameters

Hi, nice idea.
I had a similar question: does this kind of embedding sharing between networks also apply to nn.Sequential modules? Can an nn.Sequential wrapped in a lambda (with its weights inside the nn.Sequential) be shared across different instances of a class?

For example, instead of self.embedding, what if I wanted something like this:

import torch
import torch.nn as nn

class SubModule(nn.Module):

    def __init__(self, net_s):
        super(SubModule, self).__init__()
        # net_s is passed in, but the lambda builds its own nn.Sequential
        self.nets = lambda inp: nn.Sequential(nn.Linear(100, 200), nn.ReLU())(inp)

    def forward(self, input):
        return self.nets(input)

class Model(nn.Module):

    def __init__(self, net_s):
        super(Model, self).__init__()
        self.nets1 = SubModule(net_s)
        self.nets2 = SubModule(net_s)

    def forward(self, input):
        return self.nets1(input) + self.nets2(input)


Do nets1 and nets2 end up using the same nn.Sequential defined in SubModule (i.e., the same weights)?
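
This is how I tried to check it (a minimal sketch against the code above; the input shape and the prints are just mine):

model = Model(net_s=None)  # net_s is ignored in the code above, so None works here
x = torch.randn(1, 100)

out1 = model.nets1(x)
out2 = model.nets1(x)
print(torch.equal(out1, out2))  # almost certainly False: each call builds a brand-new
                                # nn.Sequential with freshly initialized weights
print(list(model.nets1.parameters()))  # []: a lambda is not an nn.Module, so its layers
                                       # never get registered and parameters() is empty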

Or am I missing something?
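
For comparison, this is what I had understood as explicit sharing, in the spirit of the self.embedding example (just a sketch; SharedSubModule is a name I made up):

shared = nn.Sequential(nn.Linear(100, 200), nn.ReLU())

class SharedSubModule(nn.Module):

    def __init__(self, net_s):
        super(SharedSubModule, self).__init__()
        self.nets = net_s  # the same instance is assigned (and registered) in every copy

    def forward(self, input):
        return self.nets(input)

a = SharedSubModule(shared)
b = SharedSubModule(shared)
print(a.nets[0].weight is b.nets[0].weight)  # True: literally one weight tensor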