What do you mean by reuse nn.Module and sharing weights?
What you have done above is reuse the nn.Module class through plain Python inheritance. You have inherited from nn.Module and are defining a derived class that contains all of nn.Module's methods as well as any new methods you define.
Sharing of weights is something different and depends on the architecture you are building.
What I mean is that I instantiated nn.Linear and assigned it to self.linear, and in the forward method I use self.linear twice. Does that mean I have repeated two identical layers here? Sorry for the confusion, I just realized the title is misleading…
Yes, you have not created two separate layers; you have used the same layer twice. Both calls will use the same weights because, at the end of the day, they are the same object. (Gradients from both uses accumulate into that one layer's parameters during backpropagation.)
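Here is a minimal sketch of the situation being described. The class name `TwoPass` and the layer sizes are made up for illustration; the point is that only one `nn.Linear` is registered, so both calls in `forward` share the same weight and bias:

```python
import torch
import torch.nn as nn

class TwoPass(nn.Module):
    # Hypothetical module: one nn.Linear instance, applied twice in forward.
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(4, 4)  # a single layer instance

    def forward(self, x):
        # Both calls go through the SAME layer, hence the same weights.
        x = self.linear(x)
        return self.linear(x)

model = TwoPass()
# Only one weight matrix and one bias are registered, not two sets.
print(len(list(model.parameters())))  # → 2
```

If you wanted two independent layers instead, you would instantiate `nn.Linear` twice (e.g. `self.linear1` and `self.linear2`), each with its own parameters.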
However: