Is -log_softmax the same as NLLLoss applied on log_softmax, and if so, why is a separate function required?
No, you won’t get the same result. nn.NLLLoss computes the loss by picking out the log probability at the target index for each sample and negating it, while log_softmax just computes the log probabilities for the complete tensor along the specified dimension. The reduction type of nn.NLLLoss can be specified and is the mean by default. So in the default use case, nn.NLLLoss will return a single scalar loss value, while the output tensor of log_softmax will have the same shape as the input tensor.
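A small sketch to make the difference concrete (tensor sizes here are arbitrary): log_softmax preserves the input shape, nll_loss reduces it to a scalar by gathering the target log probabilities, and chaining the two is equivalent to F.cross_entropy.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
x = torch.randn(4, 5)              # 4 samples, 5 classes
target = torch.tensor([1, 0, 3, 4])

log_probs = F.log_softmax(x, dim=1)   # same shape as x: [4, 5]
loss = F.nll_loss(log_probs, target)  # scalar (mean reduction by default)

# nll_loss just picks the target log probability per sample,
# negates it, and averages:
manual = -log_probs[torch.arange(4), target].mean()
print(torch.allclose(loss, manual))   # True

# Chaining log_softmax + nll_loss is equivalent to cross_entropy:
print(torch.allclose(loss, F.cross_entropy(x, target)))  # True

print(log_probs.shape)  # torch.Size([4, 5])
print(loss.shape)       # torch.Size([]) -- a scalar
```

This is also why nn.CrossEntropyLoss exists as a separate module: it fuses the two steps for numerical stability and convenience.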