conv2D RuntimeError: Expected 4-dimensional input for 4-dimensional weight [128, 10, 2, 2], but got 2-dimensional input of size [4, 10] instead

I am trying to pass the output of a pretrained classifier model through a stack of Conv2d and linear layers (to eventually get some embeddings), but since the classifier outputs a tensor of shape [4, 10], I am having trouble feeding it into the conv layers.

So, I have:

    def forward(self, data):
        x = data
        x = self.model_ft(x)  # model_ft is the pretrained model
        print(x.shape)
        # output of the above is: torch.Size([4, 10])
        x = self.conv1(x)     # defined as: self.conv1 = nn.Conv2d(10, 128, 2)
        x = F.relu(x)
        x = self.pool1(x)     # defined as: self.pool1 = nn.MaxPool2d(2)
        x = self.conv2(x)     # defined as: self.conv2 = nn.Conv2d(128, 64, 5)

However, when I run this, I get:

RuntimeError: Expected 4-dimensional input for 4-dimensional weight [128, 10, 2, 2], but got 2-dimensional input of size [4, 10] instead

I am not sure what input the Conv2d layer is expecting, and I could not really figure it out from the docs :frowning: - any pointers would be great!

Hi,

Conv2d expects a 4-dimensional input of size (batch, channel, height, width).
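For illustration, here is a minimal sketch (the tensor and layer sizes are just assumptions matching the post, not the actual model) of what Conv2d accepts versus what the classifier produces:

    import torch
    import torch.nn as nn

    conv1 = nn.Conv2d(10, 128, 2)    # weight shape: [128, 10, 2, 2]

    x2d = torch.randn(4, 10)         # classifier output: (batch, features) -> 2-D
    # conv1(x2d)                     # raises the RuntimeError from the post

    x4d = torch.randn(4, 10, 8, 8)   # (batch, channel, height, width) -> 4-D
    print(conv1(x4d).shape)          # torch.Size([4, 128, 7, 7])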

Thanks albanD - I should be using Conv1d, not Conv2d!
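For completeness, a minimal sketch of that alternative, assuming the [4, 10] classifier output is treated as a batch of 4 single-channel sequences of length 10 (the layer sizes here are illustrative, not from the original post):

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    x = torch.randn(4, 10)        # classifier output: (batch, features)
    x = x.unsqueeze(1)            # -> (batch=4, channels=1, length=10)

    conv1 = nn.Conv1d(1, 128, 2)  # Conv1d expects (batch, channel, length)
    out = F.relu(conv1(x))
    print(out.shape)              # torch.Size([4, 128, 9])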