Cross Entropy Loss for multiclass classification

Hi all,

I have a question about a multiclass classification problem: each sample belongs to exactly one class, and my targets are class indices in [0, C-1] format.

My question is: does the cross entropy loss expect logits as input? If so, do I need a sigmoid layer as the last layer of my model?

You can see in the docs what exactly each one does: CrossEntropyLoss, NLLLoss, BCELoss, and BCEWithLogitsLoss are the ones you want to check.

With cross entropy loss, it is enough to pass the class indices directly as the target. Also, no softmax (or sigmoid) layer is needed before the loss, since cross entropy loss applies log-softmax to the raw logits internally.
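A minimal sketch of this setup (the shapes and values here are just illustrative): the model's last layer outputs raw logits, the targets are class indices, and `nn.CrossEntropyLoss` matches applying log-softmax followed by `NLLLoss` by hand.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# 4 samples, 3 classes: raw logits straight from the last Linear layer,
# with no softmax (or sigmoid) applied.
logits = torch.randn(4, 3)

# Targets are class indices in [0, C-1], dtype long -- not one-hot vectors.
targets = torch.tensor([0, 2, 1, 2])

criterion = nn.CrossEntropyLoss()
loss = criterion(logits, targets)

# CrossEntropyLoss applies log_softmax internally, so it is equivalent
# to NLLLoss on log-softmaxed logits:
manual = F.nll_loss(F.log_softmax(logits, dim=1), targets)
assert torch.allclose(loss, manual)
```

If you added a softmax layer before this loss, the log-softmax would be applied twice, which usually hurts training.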