The most elegant way would probably be the functional API, which would only create a single weight parameter and just use it when it’s needed.
Alternatively, you could assign the weight parameter to your modules as described here.
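For example, here is a minimal sketch of what the functional approach could look like (the class name, shapes, and parameter names are placeholders, not from the linked post):

import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedWeightNet(nn.Module):
    def __init__(self):
        super().__init__()
        # a single weight parameter, registered once
        self.weight = nn.Parameter(torch.randn(10, 5))
        self.bias = nn.Parameter(torch.zeros(10))

    def forward(self, x, p=False):
        if p:
            # first "layer": shared weight, with bias
            return F.linear(x, self.weight, self.bias)
        # second "layer": the exact same weight tensor, no bias
        return F.linear(x, self.weight)

Since both branches call F.linear with the same self.weight, there is only one parameter to train and nothing to copy or synchronize.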
I saw the link that you attached. Will I have to make the weights of the two instances equal in the forward method every time?
Or can I just equate the weights of the two nn.Linear instances once, so that they share the same storage location and weight values for all epochs?
For example
import torch.nn as nn

class testModule(nn.Module):
    def __init__(self):
        super(testModule, self).__init__()
        self.fc1 = nn.Linear(5, 10, bias=True)
        self.fc2 = MyLinearLayerModel(10, 10, bias=False)

    def forward(self, x, p=False):
        if p:
            x = self.fc1(x)
        else:
            # copy fc1's weights into fc2 on every forward pass
            self.fc2.weight.data = self.fc1.weight.data
            x = self.fc2(x)
        return x
Or can I just do “self.fc2.weight.data = self.fc1.weight.data” once inside __init__?
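For reference, a minimal sketch of the one-time tying approach, assuming two plain nn.Linear layers with matching shapes (rather than the custom MyLinearLayerModel); note that assigning the nn.Parameter itself, rather than its .data, is what actually ties the storage:

import torch.nn as nn

class TiedModule(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(5, 10, bias=True)
        self.fc2 = nn.Linear(5, 10, bias=False)
        # tie the parameters once: both attributes now refer to the
        # same nn.Parameter, so any update to one is seen by the other
        self.fc2.weight = self.fc1.weight

    def forward(self, x, p=False):
        if p:
            return self.fc1(x)
        return self.fc2(x)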