Do I need to apply a softmax when using nn.CrossEntropyLoss?

I am working on a multiclass classification problem, so the natural choice in PyTorch is nn.CrossEntropyLoss. But the docs indicate that the loss function already includes a softmax, which confused me. If I apply a softmax at the end of my neural network model, will the softmax end up being applied twice instead of once?

No. nn.CrossEntropyLoss applies the (log-)softmax internally, so your model should output raw logits. Note that it is equivalent to the combination of LogSoftmax and NLLLoss.
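A small sketch to make this concrete (the tensor shapes and values here are just illustrative): passing raw logits to nn.CrossEntropyLoss gives the same result as LogSoftmax followed by NLLLoss, while softmaxing the logits first changes the loss value.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
logits = torch.randn(4, 3)            # raw model outputs, no softmax applied
targets = torch.tensor([0, 2, 1, 0])  # class indices

# CrossEntropyLoss expects raw logits...
ce = nn.CrossEntropyLoss()(logits, targets)

# ...and is equivalent to LogSoftmax followed by NLLLoss.
nll = nn.NLLLoss()(nn.LogSoftmax(dim=1)(logits), targets)
print(torch.allclose(ce, nll))  # True

# Softmaxing before the loss means softmax is effectively applied twice,
# which yields a different (and wrong) loss value.
ce_double = nn.CrossEntropyLoss()(torch.softmax(logits, dim=1), targets)
print(torch.allclose(ce, ce_double))  # False
```

So the fix is simply to remove the softmax layer from the end of the model (or keep it only at inference time, outside the loss computation).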
