What is the difference between log_softmax and softmax?

(yenming) #1

What is the difference between log_softmax and softmax?
How can they be explained mathematically?
Thank you!

(Kaiyang) #2

log_softmax applies the logarithm after softmax.

softmax:

exp(x_i) / exp(x).sum()

log_softmax:

log( exp(x_i) / exp(x).sum() )

log_softmax essentially computes log(softmax(x)), but the practical implementation is different: it performs the same operation more efficiently and in a more numerically stable way. You might want to have a look at http://pytorch.org/docs/master/nn.html?highlight=log_softmax#torch.nn.LogSoftmax and the source code.
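As a quick sanity check, here is a minimal sketch in plain PyTorch (the specific tensors are just illustrative) showing that the two agree mathematically, and that the fused log_softmax stays finite where the naive log(softmax(x)) underflows:

import torch
import torch.nn.functional as F

x = torch.tensor([1.0, 2.0, 3.0])

# Mathematically, log_softmax(x) == log(softmax(x)).
naive = torch.log(F.softmax(x, dim=0))
fused = F.log_softmax(x, dim=0)
print(torch.allclose(naive, fused))  # True

# With extreme logits, softmax underflows to 0 for the small entry,
# so the naive log() produces -inf; the fused version stays accurate.
big = torch.tensor([1000.0, 0.0])
print(torch.log(F.softmax(big, dim=0)))  # tensor([0., -inf])
print(F.log_softmax(big, dim=0))         # tensor([0., -1000.])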

(Debajyoti Chatterjee) #3

Can you please link to that implementation?
Is it calculated as x_i - log( exp(x).sum() )?

(Kaiyang) #4

The implementation is done in torch.nn.functional, where the function calls into C code: http://pytorch.org/docs/master/_modules/torch/nn/functional.html#log_softmax.
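For intuition, the formula you wrote is usually combined with the max-subtraction (log-sum-exp) trick so the exponentials never overflow. A minimal sketch of that idea in plain PyTorch (not the actual C source, just the same computation):

import torch

def log_softmax_sketch(x):
    # log_softmax(x)_i = x_i - log( exp(x).sum() )
    # Subtracting the max first keeps every exponent <= 0, so exp()
    # cannot overflow; the shift cancels out, leaving the result unchanged.
    m = x.max()
    return x - m - torch.log(torch.exp(x - m).sum())

x = torch.tensor([1000.0, 0.0])
print(log_softmax_sketch(x))                      # tensor([0., -1000.])
print(torch.nn.functional.log_softmax(x, dim=0))  # tensor([0., -1000.])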

(Debajyoti Chatterjee) #5

Is there any way to see the C code?
