Is there a loss function for multiclass classification in PyTorch?

Hello, I am doing text classification (3 classes) with the Hugging Face library and would like to know which loss function I should use. I have been using CrossEntropyLoss, but from reading the forums I came across categorical cross entropy.

So which one is better?

Hi Emmanuelle!

The choice will depend on how your data is “annotated.”

If your ground-truth labels (annotations) are integer class labels, then
you would use what I believe you are calling “categorical cross entropy.”

That is, each sample in your data set would be labelled with a single
integer – 0 for class-0, 1 for class-1, and 2 for class-2.
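
For concreteness, here is a minimal sketch of that case (the shapes are
made up for illustration: a batch of 4 samples and your 3 classes):

```python
import torch

loss_fn = torch.nn.CrossEntropyLoss()

logits = torch.randn(4, 3)           # raw, unnormalized model outputs (batch of 4, 3 classes)
labels = torch.tensor([0, 2, 1, 2])  # one integer class label per sample

loss = loss_fn(logits, labels)
print(loss)
```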

If your labels are probabilistic “soft” labels, then you would use the
more general full cross entropy.

In this case, each sample would be labelled with three floating-point
probabilities, P[c], where P[0] would be the probability of that sample
being in class-0, P[1] the probability of being in class-1, and so on.
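
Again as an illustrative sketch (made-up probabilities, same 4-sample,
3-class shapes as above), the soft-label case looks like this:

```python
import torch

loss_fn = torch.nn.CrossEntropyLoss()

logits = torch.randn(4, 3)  # raw, unnormalized model outputs
# One row of probabilities per sample; each row sums to 1.
soft_labels = torch.tensor([
    [0.7, 0.2, 0.1],
    [0.1, 0.1, 0.8],
    [0.3, 0.4, 0.3],
    [0.0, 1.0, 0.0],
])

loss = loss_fn(logits, soft_labels)  # probabilistic targets require pytorch >= 1.10
print(loss)
```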

As of the latest stable release, 1.10.0, pytorch’s CrossEntropyLoss
supports both categorical and probabilistic labels.

But again, the kind of labels you have determines which version of
cross-entropy loss you should use.

Best.

K. Frank
