Hi,

I know this error has been discussed many times, but I am still not able to figure it out.

The error I am getting is:

RuntimeError: Given groups=1, weight of size [12, 6, 5, 5], expected input[100, 1, 32, 32] to have 6 channels, but got 1 channels instead
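From what I understand, the weight shape `[12, 6, 5, 5]` is laid out as `[out_channels, in_channels, kernel_h, kernel_w]`, so the layer raising the error expects 6 input channels while my tensor has only 1. A plain-Python sketch of that reading (the variable names are mine):

```python
# Conv2d weight layout: [out_channels, in_channels, kernel_h, kernel_w]
weight_shape = (12, 6, 5, 5)
# Input layout: [batch, channels, height, width]
input_shape = (100, 1, 32, 32)

expected_channels = weight_shape[1]  # 6, the weight's in_channels slot
actual_channels = input_shape[1]     # 1, the channel dim of my grayscale batch

print(expected_channels, actual_channels)  # 6 1 -> mismatch, hence the error
```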

Below is my code:

```
class Network(nn.Module):
    def __init__(self):
        super(Network, self).__init__()
        self.conv1 = nn.Conv2d(in_channels=1, out_channels=6, kernel_size=5)
        self.conv2 = nn.Conv2d(in_channels=6, out_channels=12, kernel_size=5)

        self.fc1 = nn.Linear(in_features=12*4*4, out_features=120)
        self.fc2 = nn.Linear(in_features=120, out_features=60)
        self.out = nn.Linear(in_features=60, out_features=10)

    def forward(self, t):
        # input layer
        t = t
        # hidden conv layer 1
        x = F.relu(self.conv1(t))
        x = F.max_pool2d(x, kernel_size=2, stride=2)
        # hidden conv layer 2
        x = F.relu(self.conv2(t))
        x = F.max_pool2d(t, kernel_size=2, stride=2)
        # hidden linear layer
        x = x.view(-1, 12 * 4 * 4)
        x = F.relu(self.fc1(x))
        x = F.relu(x)
        # hidden linear layer
        x = self.fc2(x)
        x = F.relu(x)
        x = self.out(x)
        # return F.log_softmax(x, dim=1)
        return x
```
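For reference, this is how I worked out the `12*4*4` in `fc1`, using the usual `(n - kernel) // stride + 1` size arithmetic for the 5x5 convolutions and 2x2 max pools (plain-Python sketch, the helper names are mine). It seems to assume 28x28 inputs, while the error message shows my batch is 32x32:

```python
def conv_out(n, kernel=5, stride=1):
    # output side length of a valid (no padding) convolution
    return (n - kernel) // stride + 1

def pool_out(n, kernel=2, stride=2):
    # output side length of a 2x2 max pool
    return (n - kernel) // stride + 1

n = 28  # the 12*4*4 in fc1 matches 28x28 inputs
n = pool_out(conv_out(n))  # conv1 + pool: 28 -> 24 -> 12
n = pool_out(conv_out(n))  # conv2 + pool: 12 -> 8 -> 4
print(n)  # 4, so the flattened size is 12 * 4 * 4

m = 32  # but the error reports input[100, 1, 32, 32]
m = pool_out(conv_out(m))  # conv1 + pool: 32 -> 28 -> 14
m = pool_out(conv_out(m))  # conv2 + pool: 14 -> 10 -> 5
print(m)  # 5, which would correspond to 12 * 5 * 5
```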

I am converting the RGB images to grayscale using the lines below:

```
transform = transforms.Compose(
    [torchvision.transforms.Grayscale(num_output_channels=1),
     transforms.ToTensor(),
     transforms.Normalize([0.5], [0.5])])
```

I am not sure where the problem is; can someone point it out to me?