Defining Conv2d C_in size dynamically

Hi. I want to have a Conv2d layer whose C_in size is determined after some processing in the forward method and is not known in advance. How can I do this, since I have to declare a fixed C_in size in my __init__ method? Is there any way, like in TensorFlow where you can pass None, so that the size can be determined dynamically at run time?

You could use the functional API to define your parameters in the forward method.
Here is a small example using a random number of kernels for the conv layer:

import torch
import torch.nn as nn
import torch.nn.functional as F


class MyModel(nn.Module):
    def __init__(self):
        super(MyModel, self).__init__()
        # The parameters are created lazily in forward, once the input shape is known
        self.conv_weight = None
        self.conv_bias = None

    def forward(self, x):
        if self.conv_weight is None:
            # Pick a random number of output kernels; C_in is taken from the input
            nb_kernels = torch.randint(1, 10, (1,)).item()
            self.conv_weight = nn.Parameter(torch.randn(nb_kernels, x.size(1), 3, 3))
            self.conv_bias = nn.Parameter(torch.randn(nb_kernels))

        x = F.conv2d(x, self.conv_weight, self.conv_bias, stride=1, padding=1)
        return x


model = MyModel()
x = torch.randn(1, 3, 24, 24)
output = model(x)
output.mean().backward()

# The lazily created parameters are registered and receive gradients
print(model.conv_weight.grad)
print(list(model.parameters()))
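
One caveat: since the weight and bias are only created during the first forward pass, model.parameters() is empty before that, so an optimizer built earlier would not see them. A minimal sketch, assuming you are fine running one dummy batch first (the input shape here is just for illustration):

model = MyModel()

# Run a single dummy batch so the parameters get created and registered
_ = model(torch.randn(1, 3, 24, 24))

# Only now does model.parameters() contain conv_weight and conv_bias
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)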