Target 7 is out of bounds

I uploaded a picture of the out-of-bounds error here. I used cross-entropy loss and am running my model on the RAF-DB dataset, which contains 7 labels.

```python
epoch = 0

# Train the model
total_step = len(train_loader)
curr_lr = learning_rate

for epoch in range(num_epochs):
    for i, (images, labels) in enumerate(train_loader):
        # images =
        # labels =

        # Forward pass
        outputs = model(images)
        loss = criterion(outputs, labels)

        # Backward and optimize
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

It is showing the error at the loss function. Thanks in advance for the help.

Hi Jami!

CrossEntropyLoss expects class labels that run from
{0, 1, ..., nClass - 1} (not from 1 to nClass).

So in your case, 6 is the largest legal class label, and 7 is out
of bounds.
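Here is a minimal sketch of the issue (using random logits in place of your model's outputs): with 7 output classes, `CrossEntropyLoss` accepts targets 0 through 6, and a target of 7 triggers the error in your title.

```python
import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()
logits = torch.randn(4, 7)  # batch of 4 samples, 7 classes

# Legal labels run from 0 to 6 -- this works
ok_labels = torch.tensor([0, 3, 6, 2])
loss = criterion(logits, ok_labels)

# Label 7 is out of bounds for 7 classes -- this raises
# an error ("Target 7 is out of bounds")
bad_labels = torch.tensor([1, 7, 4, 2])
try:
    criterion(logits, bad_labels)
except Exception as e:
    print(type(e).__name__, e)
```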

By the way, please don’t post screen-shots of textual information.
Doing so breaks screen readers (accessibility), search, and copy-paste.


K. Frank

Hi KFrank,
Thanks a lot for the reply. What should I do, then? My class labels start from 1, and I have class labels from 1 to 7.
Also, I can’t see any out-of-bounds error picture here (which you mentioned you uploaded).
I posted a screenshot because it makes the error clear. Next time onwards, I will post the whole error by copy-pasting it.

Hi Jami!

Remember, class labels are really just arbitrary labels that distinguish
one class from another, but nothing else. They have no numerical meaning.

So just subtract 1 from all of your labels to get a set that runs from
0 to 6. You could either do this by preprocessing your data, or you
could do it on the fly.
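The on-the-fly version can be as simple as shifting the labels right where the loss is computed. A minimal sketch, assuming the loader yields RAF-DB labels as a `LongTensor` in the range 1..7 (the tensor values here are made-up examples):

```python
import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()
outputs = torch.randn(4, 7)          # stand-in for model(images): 4 samples, 7 classes
labels = torch.tensor([1, 7, 3, 5])  # RAF-DB-style labels in 1..7

# Shift 1..7 down to the legal range 0..6 before the loss
loss = criterion(outputs, labels - 1)
```

The preprocessing alternative is to apply the same `label - 1` once in your `Dataset`'s `__getitem__`, so the rest of the training loop never sees the 1-based labels.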

Sorry about that. I was quoting you (rather than referring to an upload
of my own), but I messed up the forum quote block. (I’ve edited my
post to fix the quoting and clean that up.)

Thanks, that would be great. Makes for a more functional forum.


K. Frank

Thanks a lot, KFrank. Shifting the class labels to run from 0 to 6 worked for me.