Does NLLLoss handle Log-Softmax and Softmax in the same way?

Since log is monotonically increasing, the biggest value in softmax(x) will also be the biggest (least negative) value in log(softmax(x)).

The softmax function returns probabilities in the range [0, 1].
The log of these probabilities lies in [-inf, 0], since log(0) = -inf and log(1) = 0.
That is why the order won’t change.
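
For example, a quick sanity check in PyTorch (a minimal sketch with an arbitrary logit tensor):

```python
import torch

x = torch.tensor([2.0, 1.0, 0.1])        # arbitrary logits
probs = torch.softmax(x, dim=0)          # probabilities in [0, 1]
log_probs = torch.log_softmax(x, dim=0)  # log probabilities in [-inf, 0]

# log is monotonically increasing, so the ranking is preserved
assert probs.argmax() == log_probs.argmax()
```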

However, nn.NLLLoss expects log probabilities, so you should use it with a log_softmax output,
or use nn.CrossEntropyLoss with raw logits if you prefer not to add an extra log_softmax layer to your model.
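
To sketch the equivalence of the two options (assuming a toy batch of logits and integer class targets):

```python
import torch
import torch.nn.functional as F

logits = torch.randn(4, 3)            # batch of 4 samples, 3 classes
targets = torch.tensor([0, 2, 1, 2])  # integer class labels

# Option 1: log_softmax in the model, NLLLoss as the criterion
loss_nll = F.nll_loss(F.log_softmax(logits, dim=1), targets)

# Option 2: raw logits with CrossEntropyLoss (applies log_softmax internally)
loss_ce = F.cross_entropy(logits, targets)

# Both give the same loss (up to floating point noise)
assert torch.allclose(loss_nll, loss_ce)
```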
