Reusing a conv layer multiple times - weight doesn't change?

Hi, I’m running the code snippet below.

import torch
import torch.nn as nn

class LFBlock(nn.Module):
    def __init__(self, kernel_size=(1,1,1), block_shape=(32,32,32)):
        super(LFBlock, self).__init__()

        self.block_shape = block_shape
        # One Conv3d instance: a single weight tensor shared by every call below
        self.conv = nn.Conv3d(1, 1, kernel_size=kernel_size, stride=1, padding=0, bias=True)

    def forward(self, x):
        # Split the 4 input channels, keeping a singleton channel dim for Conv3d
        x1 = x[:,0][:,None]
        x2 = x[:,1][:,None]
        x3 = x[:,2][:,None]
        x4 = x[:,3][:,None]

        x1 = self.conv(x1)
        print('x1 | convweight=', self.conv.weight.data)
        x2 = self.conv(x2)
        print('x2 | convweight=', self.conv.weight.data)
        x3 = self.conv(x3)
        print('x3 | convweight=', self.conv.weight.data)
        x4 = self.conv(x4)
        print('x4 | convweight=', self.conv.weight.data)
        return torch.cat([x1, x2, x3, x4], dim=1)
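
For reference, I call it like this (the batch size and the random input are just for illustration; the input needs 4 channels for the x1..x4 slices):

block = LFBlock()
out = block(torch.randn(2, 4, 32, 32, 32))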

The print output of this code snippet is:

x1 | convweight= tensor([[[[[-0.2251]]]]])
x2 | convweight= tensor([[[[[-0.2251]]]]])
x3 | convweight= tensor([[[[[-0.2251]]]]])
x4 | convweight= tensor([[[[[-0.2251]]]]])

The weights printed after each conv call are identical. This is a smaller-scale test of a model where I run many small convolutions (~16) in a single pass. Instead of defining 16 individual conv layers, I was thinking I could just reuse one. Where am I going wrong?

Hi,

The weight is associated with the Conv3d instance, so if you reuse one instance, you reuse its weights.
If you want each application to train different weights, you need to create a separate Conv3d instance for each one, e.g. in an nn.ModuleList, as sketched below.
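
For instance, a minimal sketch of your 16-conv case using nn.ModuleList (the class name MultiConvBlock and the defaults are illustrative, not from your code):

import torch
import torch.nn as nn

class MultiConvBlock(nn.Module):
    def __init__(self, n_convs=16):
        super().__init__()
        # Each entry is an independent Conv3d with its own weight and bias
        self.convs = nn.ModuleList(
            nn.Conv3d(1, 1, kernel_size=(1,1,1), stride=1, padding=0, bias=True)
            for _ in range(n_convs)
        )

    def forward(self, x):
        # Apply the i-th conv to the i-th input channel, kept as a singleton dim
        outs = [conv(x[:, i][:, None]) for i, conv in enumerate(self.convs)]
        return torch.cat(outs, dim=1)

Using nn.ModuleList rather than a plain Python list matters here: it registers every conv’s weight and bias as parameters of the block, so they all show up in block.parameters() for the optimizer.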

So if the weight was random, e.g. 0.327, before the conv, and after the conv it became -0.2251, then I do something with that weight and re-run the conv: wouldn’t it re-initialise to another random value and then change based on what I feed it? Even if it were to randomly re-initialise to 0.327, I thought it would turn out different because my inputs to it are different.

Or does it freeze the kernel weights after a single pass?

For my particular use case, I’m fine with overriding the conv weights because I just want to use the weights straight after the conv in some equations.

Hi,

You should check the docs on nn.Module’s role. That should give you a good introduction to what nn.Module (the parent class of Conv3d) is for. In short: parameters like conv.weight are created once in __init__ and are only read during the forward pass; they are never re-initialised between calls, and they only change when an optimizer applies their gradients.
Feel free to ask more questions here afterwards.
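
In the meantime, here is a quick sketch you can run to see the behaviour (the input shape and optimizer are just for illustration): a forward pass only reads the weight; a gradient step is what changes it:

import torch
import torch.nn as nn

conv = nn.Conv3d(1, 1, kernel_size=(1,1,1), bias=True)
x = torch.randn(2, 1, 32, 32, 32)

w_before = conv.weight.detach().clone()
_ = conv(x)  # forward pass: the weight is read, never rewritten
print(torch.equal(w_before, conv.weight))  # True

# Only a gradient step changes the weight
opt = torch.optim.SGD(conv.parameters(), lr=0.1)
loss = conv(x).sum()
loss.backward()
opt.step()
print(torch.equal(w_before, conv.weight))  # False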