Sharing list parameters

Hello.

I would like to understand how the parameter list is created in an nn.Module. When I add something like self.F1 = nn.Linear(700, 1000), internally the Linear module creates self.weight and self.bias and registers them as parameters.
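
For example, this is a minimal sketch of what I mean (the class name Net is just for illustration); the registered parameters show up with the F1 prefix:

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        # Linear creates self.weight and self.bias internally and registers them
        self.F1 = nn.Linear(700, 1000)

net = Net()
for name, p in net.named_parameters():
    print(name, tuple(p.shape))
# F1.weight (1000, 700)
# F1.bias (1000,)
```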

However, if in my __init__ method I assign to the same attribute name twice, one assignment overrides the other. For example:
self.w1 = torch.ones(10, 10)
self.w1 = torch.zeros(10, 10)

Yet the Linear module always uses the name self.weight internally, and this overriding does not happen when I create several Linear layers. Why not?
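
To make it concrete, here is a small sketch (names and sizes are made up) of what I mean: each Linear instance keeps its own weight and bias, while reassigning the same plain-tensor attribute on my module simply replaces the previous value:

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        # Two Linear instances: each holds its own self.weight internally,
        # and both appear separately in the parameter list.
        self.F1 = nn.Linear(10, 10)
        self.F2 = nn.Linear(10, 10)
        # Plain tensor attribute: the second assignment overrides the first.
        self.w1 = torch.ones(10, 10)
        self.w1 = torch.zeros(10, 10)

net = Net()
print([name for name, _ in net.named_parameters()])
# ['F1.weight', 'F1.bias', 'F2.weight', 'F2.bias'] -- w1 is not a parameter at all
```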

Moreover, if I create my own class, let's say class FullyConnected(nn.Module), with an nn.Linear operator inside, and then create an instance of this class inside, let's say, MyModelClass (the model where I am building my neural network), the parameters from the FullyConnected class are not added to MyModelClass.parameters(), even if FullyConnected is created like FullyConnected(MyModelClass).
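
Here is roughly the structure I mean (a simplified sketch, not my exact code; the layer sizes and names are made up):

```python
import torch
import torch.nn as nn

class FullyConnected(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(700, 1000)  # creates weight and bias internally

    def forward(self, x):
        return self.linear(x)

class MyModelClass(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = FullyConnected()  # nested custom module

    def forward(self, x):
        return self.fc(x)

model = MyModelClass()
print([name for name, _ in model.named_parameters()])
# In my actual model, the parameters coming from FullyConnected do not show up here.
```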

I am unable to understand this behaviour.

Thanks