Two instances of conv sharing the same weights

Is it possible for two instances of a convolutional layer in my __init__ method to share the same set of weights?

Ex:
self.conv1 = nn.Conv2d(…)

self.conv2 = MycustomConvFunction(…)

So I want self.conv1 and self.conv2 to share the same set of weights.
Specifically, I want to use self.conv2 for inference and self.conv1 for training.

The most elegant way would probably be to use the functional API, which would create only a single weight parameter and use it wherever it's needed.
Alternatively, you could assign the weight parameter to your modules as described here.
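
A minimal sketch of the functional approach could look like this (the shapes and the two branches are just placeholders; you would plug your custom convolution into the inference path):

import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedConv(nn.Module):
    def __init__(self):
        super().__init__()
        # a single set of parameters, used by both the training and the inference path
        self.weight = nn.Parameter(torch.randn(16, 3, 3, 3))
        self.bias = nn.Parameter(torch.zeros(16))

    def forward(self, x, train=True):
        if train:
            # "training" path
            return F.conv2d(x, self.weight, self.bias, padding=1)
        else:
            # "inference" path reuses exactly the same parameters;
            # replace this call with your custom convolution if needed
            return F.conv2d(x, self.weight, self.bias, padding=1)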


Hi @ptrblck

I saw the link that you attached. Will I have to make the weights of the two instances equal in the forward method every time?
Or can I just equate the weights of the two nn.Linear instances once, so that they share the same storage location and weight values for all epochs?

For example

class testModule(nn.Module):
    def __init__(self):
        super(testModule, self).__init__()
        self.fc1 = nn.Linear(5, 10, bias=True)
        # same in/out features as fc1 so the weights can actually be shared
        self.fc2 = MyLinearLayerModel(5, 10, bias=False)

    def forward(self, x, p=False):
        if p:
            x = self.fc1(x)
        else:
            # copy fc1's weights into fc2 on every forward call
            self.fc2.weight.data = self.fc1.weight.data
            x = self.fc2(x)
        return x

Or can I just do "self.fc2.weight.data = self.fc1.weight.data" once inside __init__?
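
i.e. something along these lines (assuming MyLinearLayerModel exposes a .weight parameter the same way nn.Linear does):

class testModule(nn.Module):
    def __init__(self):
        super(testModule, self).__init__()
        self.fc1 = nn.Linear(5, 10, bias=True)
        self.fc2 = MyLinearLayerModel(5, 10, bias=False)
        # point fc2's weight at the same Parameter once, so both layers
        # always see identical values during training and inference
        self.fc2.weight = self.fc1.weight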