Hey guys, I have a neural network whose output layer has 23 classes:
self.out = nn.Linear(in_features = 180, out_features = 23)
I’m using cross entropy loss:
loss = F.cross_entropy(predictions, batch_y)
The inputs to the cross-entropy function have the following shapes:
predictions.shape
>> torch.Size([128, 23])
batch_y.shape
>> torch.Size([128])
and I get an out-of-bounds error:
---> 38 loss = F.cross_entropy(predictions, batch_y)
IndexError: Target 23 is out of bounds.
However, when I change the out_features in the network's output linear layer from 23 to 24, the error goes away.
My labels are not one-hot encoded.
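For reference, here's the sanity check I ran on the label range (a minimal sketch; the tensor values below are made up, not my actual data). `F.cross_entropy` expects class indices in `[0, C-1]`, where `C` is the number of output units, so a label value of 23 would need at least 24 outputs:

```python
import torch
import torch.nn.functional as F

# Hypothetical batch of labels: if 23 appears, class indices
# run 0..23, which means the model needs 24 output units.
batch_y = torch.tensor([0, 5, 23])
num_classes = int(batch_y.max().item()) + 1  # labels are 0-indexed
print(num_classes)  # 24

logits = torch.randn(len(batch_y), num_classes)  # stand-in predictions
loss = F.cross_entropy(logits, batch_y)          # no IndexError with 24 classes
print(loss.item())
```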
Because I now have an extra output class, how will that affect the system?
Why has increasing the number of output neurons resolved the issue in the first place?
Have I missed something important?
Thanks for any advice; I can post the full code on request.