Need help with an uneven-sized last batch when applying five-crop augmentation

I am stuck: my custom classification network fails on the very last batch because of its size.
When I get a batch from the data loader with five-crop applied, I get the following input and target shapes.

torch.Size([6, 5, 3, 256, 256]) torch.Size([6])
torch.Size([6, 5, 3, 256, 256]) torch.Size([6])
torch.Size([6, 5, 3, 256, 256]) torch.Size([6])
...
torch.Size([1, 5, 3, 256, 256]) torch.Size([1])
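
For context, this is roughly my transform setup, following the FiveCrop example in the torchvision docs (the crop size of 256 matches the shapes above; dataset and loader details are omitted):

import torch
from torchvision import transforms

transform = transforms.Compose([
    transforms.FiveCrop(256),  # returns a tuple of 5 cropped PIL images
    transforms.Lambda(lambda crops: torch.stack(
        [transforms.ToTensor()(crop) for crop in crops])),  # -> [5, 3, 256, 256]
])

# With batch_size=6 the DataLoader then yields inputs of shape
# [6, 5, 3, 256, 256], except for the final, shorter batch: [1, 5, 3, 256, 256].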

According to the documentation, you should collapse the batch_size and ncrops dimensions before feeding the input to the network, so I did input.view(-1, 3, 256, 256). The final output after the convolutions and fully connected layers has shape [30, 64, 1, 1]; I then flattened the trailing 1 dimensions to get [30, 64]. The next step was to average over the crops as the documentation shows, e.g. result.view(bs, ncrops, -1).mean(1), so I did result.view(6, 5, -1).mean(1), which gave me back a tensor of shape [6, 64]. You can probably tell what the issue is now if you have been following along.
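To make that concrete, here is roughly what my forward step looks like right now, with the sizes hardcoded (model stands in for my actual network):

bs, ncrops = 6, 5                                 # hardcoded, this is what breaks
fused = input.view(-1, 3, 256, 256)               # [30, 3, 256, 256]
result = model(fused)                             # [30, 64, 1, 1] after conv + fc layers
result = result.flatten(1)                        # [30, 64]
result_avg = result.view(bs, ncrops, -1).mean(1)  # [6, 64], one prediction per image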

Remember, the last batch has shapes:

torch.Size([1, 5, 3, 256, 256]) torch.Size([1])

This fails when I try result.view(bs, ncrops, -1).mean(1) with bs hardcoded to 6: the last batch only contains 1 × 5 × 64 = 320 elements, which cannot be reshaped to (6, 5, -1).

How do I fix this? I am self-taught, please go easy on me.

EDIT: I AM STUPID, that's why we do the line below. I thought bs and ncrops were just placeholders.

bs, ncrops, c, h, w = input.size()
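
Reading the actual sizes from the incoming batch instead of hardcoding them makes the last partial batch work too; roughly (model is again a stand-in for my network):

bs, ncrops, c, h, w = input.size()    # e.g. (6, 5, 3, 256, 256) or (1, 5, 3, 256, 256)
result = model(input.view(-1, c, h, w))                      # [bs * ncrops, 64, 1, 1]
result_avg = result.flatten(1).view(bs, ncrops, -1).mean(1)  # [bs, 64]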