I am working on the nuScenes dataset, and for one of the output heads I am using cross-entropy loss.
input size: torch.Size([8, 3, 10, 159, 159])
target size: torch.Size([8, 10, 159, 159])
8 - batch size
3 - classes (specific to this head)
10 - d1 (the overall main classes; for each of them, this head predicts one of the 3 values mentioned above)
159 - d2 (height)
159 - d3 (width)
Our model has several output heads, and together they predict all the attributes for the 10 main classes.
With a batch size of 1, everything works fine:
input [1, 3, 10, 159, 159]
target [1, 10, 159, 159]
But when I increase the batch size, the cross-entropy loss fails with: "RuntimeError: The size of tensor a (8) must match the size of tensor b (10) at non-singleton dimension 1".
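For reference, here is a minimal sketch of the shapes I am passing, with random tensors standing in for my model's output and labels. As far as I can tell from the docs, `nn.CrossEntropyLoss` should accept `(N, C, d1, d2, d3)` logits with `(N, d1, d2, d3)` integer targets:

```python
import torch
import torch.nn as nn

# Stand-ins for the real model output and labels:
# logits: (batch, classes-for-this-head, main classes, height, width)
logits = torch.randn(8, 3, 10, 159, 159)
# target: (batch, main classes, height, width), class indices in [0, 3)
target = torch.randint(0, 3, (8, 10, 159, 159))

criterion = nn.CrossEntropyLoss()
loss = criterion(logits, target)
print(loss)  # scalar loss tensor
```

With plain `nn.CrossEntropyLoss()` and these shapes the call runs for me without error, so I suspect the mismatch comes from something else in my setup (maybe a per-class `weight` tensor or a reshape in the head), but I can't pin it down.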
I am using torch 1.7.1.
Am I missing something in the definition of cross-entropy loss?
Any suggestions on how I can fix this?