Given groups=1, weight of size [256, 2048, 1, 1], expected input[2, 960, 34, 46] to have 2048 channels, but got 960 channels instead

Hi all,

I’m trying to train the deeplabv3_mobilenet_v3_large model (pretrained). When I call model(imgs) with imgs of size torch.Size([2, 3, 544, 728]) (a batch of 2 RGB images), I get this error: Given groups=1, weight of size [256, 2048, 1, 1], expected input[2, 960, 34, 46] to have 2048 channels, but got 960 channels instead.
Why is this error raised? I don’t have this issue with the other deeplab models.

Thanks in advance for the help!

I cannot reproduce the issue with the given input shape:

import torch
from torchvision import models

model = models.segmentation.deeplabv3_mobilenet_v3_large()
x = torch.randn(2, 3, 544, 728)
out = model(x)
print(out['out'].shape)
# > torch.Size([2, 21, 544, 728])

Hey, did you solve this problem? I’m running into the same issue.

I started with:

from torchvision import models
from torchvision.models.segmentation.deeplabv3 import DeepLabHead

def deeplabv3_mobilenet(output_classes):
    model = models.segmentation.deeplabv3_mobilenet_v3_large(weights="DeepLabV3_MobileNet_V3_Large_Weights.COCO_WITH_VOC_LABELS_V1")
    model.classifier = DeepLabHead(960, output_classes)
    return model

model = deeplabv3_mobilenet(9)
img = d1[0]["image"].unsqueeze(dim=0)
print(img.shape)
# torch.Size([1, 3, 1022, 1820])

Then, when I used

model(d1[0]["image"].unsqueeze(dim=0))

I am also receiving the same error:
RuntimeError: Given groups=1, weight of size [256, 2048, 1, 1], expected input[1, 960, 64, 114] to have 2048 channels, but got 960 channels instead

Thanks in advance. Sorry for any typos; this is my first post here.

I cannot reproduce the reported issue; instead I see a different error for an input of [1, 3, 1022, 1820]:

import torch
from torchvision import models

output_classes = 10
model = models.segmentation.deeplabv3_mobilenet_v3_large()
model.classifier = models.segmentation.deeplabv3.DeepLabHead(960, output_classes)

x = torch.randn(1, 3, 1022, 1820)
out = model(x)
# ValueError: Expected more than 1 value per channel when training, got input size torch.Size([1, 256, 1, 1])

which indicates a failure in a batchnorm layer, as it cannot compute the batch statistics from a single value per channel.
Increasing the batch size works for me:

x = torch.randn(2, 3, 1022, 1820)
out = model(x)
print(out["out"].shape)
# torch.Size([2, 10, 1022, 1820])