RuntimeError: Given groups=1, weight of size [256, 256, 3, 3], expected input[64, 1, 256, 256] to have 256 channels, but got 1 channels instead

I got

RuntimeError: Given groups=1, weight of size [256, 256, 3, 3], expected input[64, 1, 256, 256] to have 256 channels, but got 1 channels instead

I think it is because the output of self.net has shape (64, 1, 256, 256), i.e. a single channel, while the first conv in self.layer1 expects 256 input channels. But I can't find a way to fix it. Could anyone please tell me how to handle this problem? I'd very much appreciate it.
Excuse my ignorance. Thank you all!

class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = torch.hub.load(
            'mateuszbuda/brain-segmentation-pytorch', 'unet',
            in_channels=3, out_channels=1, init_features=32, pretrained=True,
        )
        # output: (N, 1, 256, 256), a single-channel segmentation map
        self.layer1 = nn.Sequential(
            nn.Conv2d(256, 256, 3, padding=1),
            nn.Conv2d(256, 256, 3, padding=1),
            nn.Conv2d(256, 256, 1, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=3, stride=1, padding=1)
        )
        self.layer2 = nn.Sequential(
            nn.Conv2d(128, 128, 3, padding=1),
            nn.Conv2d(128, 128, 3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=3, stride=1, padding=1)
        )
        self.layer3 = nn.Sequential(
            nn.Conv2d(64, 64, 3, padding=1),
            nn.Conv2d(64, 64, 1, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=3, stride=1, padding=1)
        )
        self.layer4 = nn.Sequential(
            nn.Linear(1024, 1),
            nn.Sigmoid()
        )
        
    def forward(self,x):
        x = self.net(x)
        x = self.layer1(x)
        x = self.layer2(x)
        x = self.layer3(x)
        x = self.layer4(x)
        return x
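For reference, the mismatch can be reproduced in isolation (a minimal sketch; the shapes are taken from the traceback above, not from running the hub model):

```python
import torch
import torch.nn as nn

# self.net outputs a (64, 1, 256, 256) tensor, but the first conv in
# self.layer1 declares in_channels=256, so the weight cannot be applied.
out = torch.randn(64, 1, 256, 256)        # stand-in for self.net's output
conv = nn.Conv2d(256, 256, 3, padding=1)  # first layer of self.layer1
err = None
try:
    conv(out)
except RuntimeError as e:
    err = e
print(err)  # "... expected input[64, 1, 256, 256] to have 256 channels ..."
```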

Try changing this:

self.layer1 = nn.Sequential(
            nn.Conv2d(256, 256, 3, padding=1),
            nn.Conv2d(256, 256, 3, padding=1),
            nn.Conv2d(256, 256, 1, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=3, stride=1, padding=1)
        )

Into this:

self.layer1 = nn.Sequential(
            nn.Conv2d(1, 256, 3, padding=1),
            nn.Conv2d(256, 256, 3, padding=1),
            nn.Conv2d(256, 256, 1, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=3, stride=1, padding=1)
        )

The first nn.Conv2d layer in self.layer1 expects 256 input channels (nn.Conv2d(256, 256, 3, padding=1)), while self.net outputs a single-channel tensor, hence the error. Changing it to nn.Conv2d(1, 256, 3, padding=1) should resolve this particular error. Note that the later layers have similar mismatches (self.layer2 expects 128 input channels but self.layer1 outputs 256, and so on), so you may need to adjust those as well.
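As a quick sanity check (a minimal sketch using random input; the batch size of 2 is an arbitrary choice), the corrected self.layer1 now accepts a single-channel 256x256 input. Also note that the 1x1 conv with padding=1 grows the spatial dimensions by 2, which may or may not be what you want:

```python
import torch
import torch.nn as nn

# self.layer1 with the corrected first conv: in_channels=1 matches the
# single-channel map produced by self.net.
layer1 = nn.Sequential(
    nn.Conv2d(1, 256, 3, padding=1),
    nn.Conv2d(256, 256, 3, padding=1),
    nn.Conv2d(256, 256, 1, padding=1),  # 1x1 kernel + padding=1 adds 2 to H and W
    nn.ReLU(),
    nn.MaxPool2d(kernel_size=3, stride=1, padding=1),
)

x = torch.randn(2, 1, 256, 256)  # small batch of single-channel maps
y = layer1(x)
print(y.shape)  # torch.Size([2, 256, 258, 258])
```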


Thank you for your reply. It solved my problem!
