Using Conv1d with four-dimensional input

I’m using Conv1d to process input which is in the format [batch_size, Nc, Nh, Nw].

m = nn.Conv1d(16, 2, 3, stride=2) 
input = torch.randn(20, 16, 50, 50) 
output = m(input)

It gives the following error:
RuntimeError: Expected 3-dimensional input for 3-dimensional weight 2 16 3, but got 4-dimensional input of size [20, 16, 50, 50] instead

I want to reduce the 50x50x16 (height, width, channels) input to 50x50x2 (height, width, channels). How can this be done? Thanks in advance!


You are passing a 4-dimensional input, so you should use nn.Conv2d for it.
Also, if you don't want to reduce the spatial size, you could use a 1x1 kernel or add padding, since your current setup (kernel size 3, stride 2) would yield an output of torch.Size([20, 2, 24, 24]).
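
For example, a minimal sketch of the 1x1 kernel approach, reusing the shapes from your snippet (variable names are just illustrative):

import torch
import torch.nn as nn

# 1x1 kernel: maps 16 channels to 2 while keeping the 50x50 spatial size
m = nn.Conv2d(16, 2, kernel_size=1)
input = torch.randn(20, 16, 50, 50)  # [batch_size, Nc, Nh, Nw]
output = m(input)
print(output.shape)  # torch.Size([20, 2, 50, 50])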