https://discuss.pytorch.org/t/runtimeerror-given-groups-1-weight-of-size-64-3-7-7-expected-input-3-1-224-224-to-have-3-channels-but-got-1-channels-instead/30153/28

I got this error while running my code. Can you help me fix it?
D:\Python3.8.6\lib\site-packages\torchvision\transforms\transforms.py:287: UserWarning:
Argument interpolation should be of type InterpolationMode instead of int. Please, use InterpolationMode enum.
warnings.warn(

generator parameters: 902346

discriminator parameters: 5215425

0%| | 0/13 [00:05<?, ?it/s]
Traceback (most recent call last):
  File "train.py", line 79, in <module>
    fake_img = netG(z)
  File "D:\Python3.8.6\lib\site-packages\torch\nn\modules\module.py", line 1102, in _call_impl
    return forward_call(*input, **kwargs)
  File "C:\Users\63288\Desktop\CF\SRGAN-master\model.py", line 91, in forward
    x = self.mct1(x)
  File "D:\Python3.8.6\lib\site-packages\torch\nn\modules\module.py", line 1102, in _call_impl
    return forward_call(*input, **kwargs)
  File "C:\Users\63288\Desktop\CF\SRGAN-master\model.py", line 17, in forward
    out1 = self.conv1(x)
  File "D:\Python3.8.6\lib\site-packages\torch\nn\modules\module.py", line 1102, in _call_impl
    return forward_call(*input, **kwargs)
  File "D:\Python3.8.6\lib\site-packages\torch\nn\modules\conv.py", line 446, in forward
    return self._conv_forward(input, self.weight, self.bias)
  File "D:\Python3.8.6\lib\site-packages\torch\nn\modules\conv.py", line 442, in _conv_forward
    return F.conv2d(input, weight, bias, self.stride,
RuntimeError: Given groups=1, weight of size [64, 64, 1, 1], expected input[64, 3, 22, 22] to have 64 channels, but got 3 channels instead

self.conv1 expects an input with 64 channels while your activation has 3 channels.
Check where this conv layer is used and either change its in_channels or make sure the expected input is passed to it.

Could you please make that clearer, or tell me exactly what to change it to? I'm just starting to learn about convolutions and don't quite understand it yet.

Your netG contains a module called self.mct1, which then calls into self.conv1 with an input in the shape [64, 3, 22, 22] while it expects an input with 64 channels.
Here is a small code snippet showing the error:

import torch
import torch.nn as nn

# fails
conv1 = nn.Conv2d(in_channels=64, out_channels=64, kernel_size=1)
x = torch.randn(64, 3, 22, 22)
out = conv1(x)
# RuntimeError: Given groups=1, weight of size [64, 64, 1, 1], expected input[64, 3, 22, 22] to have 64 channels, but got 3 channels instead

# works
conv1 = nn.Conv2d(in_channels=3, out_channels=64, kernel_size=1)
x = torch.randn(64, 3, 22, 22)
out = conv1(x)
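To track down such mismatches quickly, you could also print the input/output shape of every conv layer with forward hooks. This helper is just a debugging sketch (not part of your SRGAN code), shown here on a small stand-in model:

```python
import torch
import torch.nn as nn

def register_shape_hooks(model):
    # attach a forward hook to every Conv2d so each call prints
    # the layer name plus its input and output shapes
    for name, module in model.named_modules():
        if isinstance(module, nn.Conv2d):
            module.register_forward_hook(
                lambda m, inp, out, name=name: print(
                    f"{name}: in {tuple(inp[0].shape)} -> out {tuple(out.shape)}"
                )
            )

# stand-in model; replace with your netG to inspect it
model = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=3, padding=1),
    nn.Conv2d(64, 64, kernel_size=3, padding=1),
)
register_shape_hooks(model)
_ = model(torch.randn(1, 3, 22, 22))
# prints one line per conv layer with its input/output shape
```

The first layer whose printed input shape does not match its `in_channels` is the one to fix.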

There is still an error. I changed part of the code:
class MCT(nn.Module):
    def __init__(self, channels):
        super(MCT, self).__init__()
        # use 3 conv layers instead of a single conv layer
        self.conv1 = nn.Conv2d(in_channels=3, out_channels=64, kernel_size=1, stride=1, padding=4)
        self.conv2 = nn.Conv2d(in_channels=3, out_channels=64, kernel_size=3, stride=1, padding=4)
        self.conv3 = nn.Conv2d(in_channels=3, out_channels=64, kernel_size=5, stride=1, padding=4)
Another error was reported:
UserWarning: Argument interpolation should be of type InterpolationMode instead of int. Please, use InterpolationMode enum.
warnings.warn(

generator parameters: 765706

discriminator parameters: 5215425

0%| | 0/13 [00:05<?, ?it/s]
Traceback (most recent call last):
  File "train.py", line 79, in <module>
    fake_img = netG(z)
  File "D:\Python3.8.6\lib\site-packages\torch\nn\modules\module.py", line 1102, in _call_impl
    return forward_call(*input, **kwargs)
  File "C:\Users\63288\Desktop\CF\SRGAN-master\model.py", line 92, in forward
    x = self.mct1(x)
  File "D:\Python3.8.6\lib\site-packages\torch\nn\modules\module.py", line 1102, in _call_impl
    return forward_call(*input, **kwargs)
  File "C:\Users\63288\Desktop\CF\SRGAN-master\model.py", line 31, in forward
    out = torch.cat((out1, out2, out3), dim=1)
RuntimeError: Sizes of tensors must match except in dimension 1. Expected size 30 but got size 28 for tensor number 1 in the list.

The new error is:

out = torch.cat((out1, out2, out3), dim=1)
RuntimeError: Sizes of tensors must match except in dimension 1. Expected size 30 but got size 28 for tensor number 1 in the list.

torch.cat requires all tensors to have the same shape in every dimension except the concatenation dimension (dim1). Your three conv layers use different kernel sizes (1, 3, 5) but the same padding=4, so their outputs have different spatial sizes (e.g. 30 vs. 28), so make sure all tensors have the same shape except in dim1.
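With stride=1, an odd kernel keeps the spatial size unchanged if you set padding = kernel_size // 2. Here is a minimal sketch of the block with that fix (the channel counts come from your posted code; the forward method is my assumption about what you intended):

```python
import torch
import torch.nn as nn

class MCT(nn.Module):
    def __init__(self, channels=3):
        super(MCT, self).__init__()
        # padding = kernel_size // 2 keeps H and W unchanged for
        # odd kernels with stride 1, so all branches align
        self.conv1 = nn.Conv2d(channels, 64, kernel_size=1, stride=1, padding=0)
        self.conv2 = nn.Conv2d(channels, 64, kernel_size=3, stride=1, padding=1)
        self.conv3 = nn.Conv2d(channels, 64, kernel_size=5, stride=1, padding=2)

    def forward(self, x):
        out1 = self.conv1(x)
        out2 = self.conv2(x)
        out3 = self.conv3(x)
        # all three are [N, 64, H, W], so concatenating in dim1 works
        return torch.cat((out1, out2, out3), dim=1)

x = torch.randn(64, 3, 22, 22)
out = MCT()(x)
print(out.shape)  # torch.Size([64, 192, 22, 22])
```

Note that the concatenated output now has 192 channels, so whatever layer follows mct1 must expect 192 input channels (or you could add a 1x1 conv to reduce back to 64).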

So how do I change it? Do you have any good suggestions?

I've consulted a lot of material but still can't fix this error. If you have time, please help me change it and explain. Thank you very much!