ValueError: expected 4D input (got 3D input) (Different)

I was working with a PyTorch model when I got this error. At first glance I thought the issue might have been raised because the model expects (Batch x 3D size) for an image. But then I recalled that PyTorch has the functionality to pass both batched and non-batched data directly into subclasses of nn.Module, like this:

layer = nn.Linear(4, 5)

non_batched_input = torch.rand(4)
output = layer(non_batched_input)

Output = tensor([-0.1435,  0.4662, -0.6626, -0.3902,  0.4086], grad_fn=<ViewBackward0>)

batched_input = torch.rand(5, 4)
output = layer(batched_input)

Output = tensor([[-0.5876,  0.2451, -0.7482, -0.1869,  0.0120],
        [-0.2541,  0.2187, -0.5797, -0.1493,  0.4184],
        [-0.3964,  0.1388, -0.5467, -0.0359,  0.2836],
        [-0.2042,  0.5370, -0.6630, -0.4109,  0.6399],
        [-0.2992,  0.1950, -0.6337, -0.1641,  0.0830]], grad_fn=<AddmmBackward0>)

I also found the same issue on the forums, but that one was related to batching. My question is: why is PyTorch, at this point, not able to recognize batched vs. non-batched data?

This is specifically not an issue, as it has already been solved here with the addition of a single line of code:

valid_result = model(image_valid.unsqueeze(0))
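For anyone hitting the same error, here is a minimal sketch of the failure and the fix. The layer and shapes are illustrative assumptions (a 3-channel, 8x8 image), not taken from the original model:

```python
import torch
import torch.nn as nn

bn = nn.BatchNorm2d(3)             # expects 4D input: (N, C, H, W)
image_valid = torch.rand(3, 8, 8)  # a single image: (C, H, W), no batch dim

try:
    bn(image_valid)                # 3D input raises the ValueError
except ValueError as e:
    print(e)                       # expected 4D input (got 3D input)

# unsqueeze(0) adds the batch dimension: (3, 8, 8) -> (1, 3, 8, 8)
valid_result = bn(image_valid.unsqueeze(0))
print(valid_result.shape)          # torch.Size([1, 3, 8, 8])
```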

I don’t understand this question, as it seems the outputs are expected.
For an unbatched input you will receive an unbatched output, and a batched input will create a batched output. Could you describe what the issue is?

According to this link, we need to give the input to the layer in the format of (Batch x size). But in my experience with PyTorch, it has always been able to recognize the difference between batched and non-batched input (as you also said).

  • So why, at this point, do we need to define an extra dimension for the batch before passing the input to the layer?
  • If it can already create distinct outputs for batched (4D) and non-batched (3D) input, why is it throwing a ValueError?
  • If it can process both 4D and 3D input, why is it expecting 4D input?

The specific layer I was getting the issue with was nn.BatchNorm2d.
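That explains the error: nn.BatchNorm2d computes per-channel statistics over the batch and spatial dimensions, so it is only defined on 4D (N, C, H, W) tensors and, unlike nn.Linear, does not accept an unbatched input. A short sketch with hypothetical sizes:

```python
import torch
import torch.nn as nn

bn = nn.BatchNorm2d(num_features=3)
x = torch.rand(2, 3, 4, 4)        # (N=2, C=3, H=4, W=4)
y = bn(x)

# In training mode the mean/variance are computed per channel across
# N, H, and W, so each channel of the output is normalized to ~zero mean:
print(y.mean(dim=(0, 2, 3)))      # close to 0 for each of the 3 channels
print(y.shape)                    # torch.Size([2, 3, 4, 4])
```

Because these statistics are defined across the batch, the layer needs an explicit batch dimension even for a single image, which is exactly what `unsqueeze(0)` provides.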

Sorry for replying late :)