IndexError: Target is out of bounds

Hey guys, I have a neural network with an output layer of 23 classes:

self.out = nn.Linear(in_features = 180, out_features = 23) 

I’m using cross entropy loss:

loss = F.cross_entropy(predictions, batch_y)

The inputs to the cross entropy function are of shape:

predictions.shape
>> torch.Size([128, 23])

batch_y.shape
>> torch.Size([128])

and I get an out of bounds error:

---> 38         loss = F.cross_entropy(predictions, batch_y) 

IndexError: Target 23 is out of bounds.
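For reference, a stripped-down snippet along these lines reproduces the same error (the tensors here are random placeholders standing in for my real data):

import torch
import torch.nn.functional as F

# 128 samples, 23 output classes, matching the shapes above
predictions = torch.randn(128, 23)

# placeholder targets; my encoded labels evidently include the value 23
batch_y = torch.randint(1, 24, (128,))
batch_y[0] = 23  # make sure the offending value is present

# raises IndexError: Target 23 is out of bounds, since valid targets are 0..22
loss = F.cross_entropy(predictions, batch_y)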

However, when I change the out_features in the output linear layer of the network from ‘23’ to ‘24’, it resolves the issue.
My labels are not one-hot encoded.
Because I now have an extra output class, how will that affect the system?
Why has increasing the number of output neurons resolved the issue in the first place?
Have I missed something important?

Thanks for any advice, I can post the full code on request 🙂

Hi,
Classes must span from 0 to N-1 (22 in your case). I think your classes span from 1 to 23.
Change batch_y to span from 0 to 22, and keep your output neurons at 23.
I suspect this is the issue!
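If your labels are literally the integers 1 to 23, a minimal sketch of the fix (assuming batch_y is an integer tensor of class labels) is just:

# shift labels from 1..23 down to 0..22 before computing the loss
batch_y = batch_y - 1
loss = F.cross_entropy(predictions, batch_y)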


My targets are categorical data that I have encoded numerically as 1-23. That is most likely the problem. I will try to fix it and see if that works.
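For example, something along these lines is what I have in mind, mapping the categories straight to 0-based indices when building the targets (the category names below are just placeholders, not my actual data):

import torch

# hypothetical categories -- in reality I have 23 of them
categories = ["cat_a", "cat_b", "cat_c"]

# 0-based index for each category, so targets span 0..N-1 instead of 1..N
class_to_idx = {name: idx for idx, name in enumerate(sorted(categories))}

raw_labels = ["cat_b", "cat_a", "cat_c", "cat_b"]  # placeholder raw labels
batch_y = torch.tensor([class_to_idx[name] for name in raw_labels])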

Thanks!

Yes, that's the issue. Have a look at this:
https://pytorch.org/docs/master/generated/torch.nn.CrossEntropyLoss.html

It needs classes to span from 0 to N-1.
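A quick sanity check before the loss call can catch this kind of mismatch early (num_classes here is just your 23):

num_classes = 23
# cross_entropy expects integer class indices in the range [0, num_classes - 1]
assert int(batch_y.min()) >= 0 and int(batch_y.max()) < num_classes, \
    "targets out of range for cross_entropy"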


I’ve marked your first post as the solution, thanks for your help @chetan_patil!

You’re welcome @PresidentDoggo !

We have a similar issue, but doing this is not solving it, please help! 🙁