Why is my linear layer expecting a 4-dimensional weight?

I am very new to deep learning. I am working on the CIFAR-10 dataset and have created a CNN model, shown below.

import torch.nn as nn
import torch.nn.functional as F

class Net2(nn.Module):
  def __init__(self):
    super(Net2, self).__init__()
    self.conv1 = nn.Conv2d(3, 32, 5, 1)
    self.fc1 = nn.Linear(32 * 4 * 4, 512)
    self.fc2 = nn.Linear(512,10)


  def forward(self, x):
    x = x.view(x.size(0), -1)
    x = F.max_pool2d(F.relu(self.conv1(x)),(2,2))
    x = F.relu(self.fc1(x))
    x = self.fc2(x)
    return x
    

net2 = Net2().to(device)

My assignment requirements are to create a model with:

Convolutional layer with 32 filters, kernel size of 5x5 and stride of 1.
Max Pooling layer with kernel size of 2x2 and default stride.
ReLU Activation Layers.
Linear layer with output of 512.
ReLU Activation Layers.
A linear layer with output of 10.

Which I believe I have written. But I suspect I am going down the wrong path. Please help me write the correct model, and also explain the reasoning behind the arguments to the Conv2d and Linear layers.

The error I am getting from my code is as follows:

RuntimeError: Expected 4-dimensional input for 4-dimensional weight [32, 3, 5, 5], but got 2-dimensional input of size [1024, 3072] instead

The line x = x.view(x.size(0), -1) reshapes your input into a 2D tensor before the convolution, but nn.Conv2d requires a 4D input of shape (batch, channels, height, width). Flatten the tensor only after the convolution and pooling step.
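If you move the flatten so it comes after the convolution and pooling, and adjust fc1's input size to match, the model should line up with the assignment requirements. Here is a sketch of how the model could look, assuming 32x32 CIFAR-10 images and no padding on the conv layer: a 5x5 kernel with stride 1 turns 32x32 into 28x28, and the 2x2 max pool halves that to 14x14, so the first linear layer sees 32 * 14 * 14 = 6272 features (the 32 * 4 * 4 in your version would only be correct for a much smaller feature map).

import torch.nn as nn
import torch.nn.functional as F

class Net2(nn.Module):
  def __init__(self):
    super(Net2, self).__init__()
    # 3 input channels (RGB), 32 filters, 5x5 kernel, stride 1
    self.conv1 = nn.Conv2d(3, 32, 5, 1)
    # 32x32 input -> 28x28 after the conv -> 14x14 after 2x2 max pooling,
    # so the flattened feature vector has 32 * 14 * 14 = 6272 values
    self.fc1 = nn.Linear(32 * 14 * 14, 512)
    self.fc2 = nn.Linear(512, 10)

  def forward(self, x):
    # x arrives as (batch, 3, 32, 32); run conv, ReLU and pooling first
    x = F.max_pool2d(F.relu(self.conv1(x)), (2, 2))
    # flatten only after the convolutional part is done
    x = x.view(x.size(0), -1)
    x = F.relu(self.fc1(x))
    x = self.fc2(x)
    return x

As for the arguments: Conv2d(3, 32, 5, 1) is (in_channels, out_channels, kernel_size, stride), i.e. 3 RGB input channels, 32 filters, a 5x5 kernel and stride 1. Linear(in_features, out_features) needs in_features to equal the length of the flattened tensor that reaches it, and out_features is whatever the assignment asks for (512, then 10 for the CIFAR-10 classes).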
