Dynamic input channels for layers

Hello everyone.
I want to create a transposed-convolution upsampling layer like this:

import torch.nn as nn

class TransposeX2(nn.Sequential):
    def __init__(self, in_channels, out_channels):
        # note: with groups=out_channels, in_channels must be divisible by out_channels
        layers = [
            nn.ConvTranspose2d(in_channels, out_channels, kernel_size=4, stride=2, padding=1, groups=out_channels),
            nn.BatchNorm2d(out_channels),
            nn.ReLU(),
        ]
        super().__init__(*layers)

but I also want to use this layer multiple times. What should I do to avoid creating a new layer for every upsample operation, like this:

self.upsample1 = TransposeX2(4, 8)
self.upsample2 = TransposeX2(8, 8)
def forward(self, x):
    a = self.upsample1(x)
    b = self.upsample2(a)
    return b

I want to create the upsample layer once and then reuse it every time, like this:

self.upsample = TransposeX2()
def forward(self, x):
    a = self.upsample(x)
    b = self.upsample(a)
    return b

Is it possible to create the layer without fixing in_channels, so that it accepts any number of input channels?

I’ve tried the approach shown below, but I don’t think it’s a proper way to do this, and ONNX conversion also raises errors when I create layers like this:

def forward(self, x):
    ic = x.shape[1]
    oc = x.shape[1] // 2
    # creates brand-new, randomly initialized layers on every call,
    # so their weights are never registered as parameters or trained
    x = nn.ConvTranspose2d(ic, oc, kernel_size=4, stride=2, padding=1, groups=oc).to(x.device)(x)
    x = nn.BatchNorm2d(oc).to(x.device)(x)
    x = nn.ReLU().to(x.device)(x)
    return x

I’ve also tried the functional API (F.conv_transpose2d), but it behaves strangely (or I did something wrong), and ONNX conversion does not accept it either.

Thank you for your help!

Fundamentally, I don’t think it is possible for a single ConvTranspose2d layer to accept any number of input channels, because the shape of its weight tensor depends on that parameter! What would it mean to need more weights than we have, or to ignore some existing weights, as the number of input channels varies?
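
To make this concrete, here is a quick check (using the kernel size from the question, without the groups argument) showing that in_channels is baked into the weight tensor’s shape:

import torch.nn as nn

conv = nn.ConvTranspose2d(4, 8, kernel_size=4, stride=2, padding=1)
print(conv.weight.shape)  # torch.Size([4, 8, 4, 4]) -- the first dimension is in_channels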

Instead, a workaround you might try is to define some number of ConvTranspose2d layers ahead of time and dispatch to the correct one depending on the current shape.
For example:

def forward(self, x):
    # channels live in dim 1 for NCHW tensors
    if x.size(1) == size1:
        x = layer1(x)
    elif x.size(1) == size2:
        x = layer2(x)
    ...
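
Filling that in, here is a minimal sketch of the dispatch pattern, reusing the TransposeX2 class from the question; the UpsampleDispatch name and the example channel widths of 8 and 16 are just illustrative, chosen so that the groups=out_channels constraint holds:

class UpsampleDispatch(nn.Module):
    def __init__(self):
        super().__init__()
        # build one layer per expected input width ahead of time
        self.upsample_8 = TransposeX2(8, 8)
        self.upsample_16 = TransposeX2(16, 16)

    def forward(self, x):
        # pick the layer whose in_channels matches the incoming tensor
        if x.size(1) == 8:
            return self.upsample_8(x)
        elif x.size(1) == 16:
            return self.upsample_16(x)
        raise ValueError(f"unexpected channel count: {x.size(1)}")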

Thanks, this solved my problem perfectly!