What is the difference between log_softmax and softmax?

log_softmax applies the logarithm after softmax.

softmax:

exp(x_i) / exp(x).sum()

log_softmax:

log( exp(x_i) / exp(x).sum() )
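
Expanding the log turns the division into a subtraction, which is the form a stable implementation computes directly:

x_i - log( exp(x).sum() )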

log_softmax essentially does log(softmax(x)), but the practical implementation is different: computing the two operations separately is slower and numerically unstable, so log_softmax fuses them into one step that produces the same result. You might want to have a look at http://pytorch.org/docs/master/nn.html?highlight=log_softmax#torch.nn.LogSoftmax and the source code.
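
As a quick sanity check, here is a minimal sketch (with arbitrary example values) comparing the two-step and fused versions via torch.nn.functional:

```python
import torch
import torch.nn.functional as F

x = torch.tensor([1.0, 2.0, 3.0])

# Two-step version: softmax first, then log.
two_step = torch.log(F.softmax(x, dim=0))

# Fused version.
fused = F.log_softmax(x, dim=0)

print(torch.allclose(two_step, fused))  # True

# With large-magnitude inputs the two-step version underflows:
# softmax returns an exact 0, whose log is -inf, while the
# fused version stays finite.
y = torch.tensor([0.0, -1000.0])
print(torch.log(F.softmax(y, dim=0)))  # tensor([0., -inf])
print(F.log_softmax(y, dim=0))         # tensor([0., -1000.])
```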
