# What is the difference between log_softmax and softmax?

**yenming**(yenming) #1

What is the difference between log_softmax and softmax?

How are they expressed mathematically?

Thank you!

**KaiyangZhou**(Kaiyang) #2

log_softmax applies the logarithm after the softmax.

softmax:

```
exp(x_i) / exp(x).sum()
```

log_softmax:

```
log( exp(x_i) / exp(x).sum() )
```

log_softmax essentially computes log(softmax(x)), but the practical implementation is different: it is more numerically stable and efficient while producing the same result. You might want to have a look at http://pytorch.org/docs/master/nn.html?highlight=log_softmax#torch.nn.LogSoftmax and the source code.
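To make the two formulas concrete, here is a minimal pure-Python sketch (for illustration only, not PyTorch's actual implementation; the naive exponentials here can overflow for large inputs):

```python
import math

def softmax(xs):
    # softmax: exp(x_i) / sum_j exp(x_j)
    exps = [math.exp(x) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def log_softmax(xs):
    # log_softmax: log(exp(x_i) / sum_j exp(x_j)) = x_i - log(sum_j exp(x_j))
    log_total = math.log(sum(math.exp(x) for x in xs))
    return [x - log_total for x in xs]

xs = [1.0, 2.0, 3.0]
# log(softmax(x)) agrees with log_softmax(x) up to floating-point error
assert all(abs(math.log(s) - l) < 1e-12
           for s, l in zip(softmax(xs), log_softmax(xs)))
```

Note how the log cancels the exp in the numerator, which is why log_softmax can be written as `x_i - log( exp(x).sum() )` without ever dividing.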


**dchatterjee172**(Debajyoti Chatterjee) #3

Can you please link that implementation?

Is it calculated as `x_i - log( exp(x).sum() )`?

**KaiyangZhou**(Kaiyang) #4

The implementation is done in `torch.nn.functional`, where the function dispatches to C code: http://pytorch.org/docs/master/_modules/torch/nn/functional.html#log_softmax.
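For intuition on why a dedicated implementation is more stable than literally taking log(softmax(x)): a common approach is the log-sum-exp trick, which shifts by the maximum before exponentiating. This is a sketch of that idea in pure Python; whether PyTorch's C code does exactly this is an assumption on my part.

```python
import math

def log_softmax_stable(xs):
    # x_i - log(sum_j exp(x_j)) == (x_i - m) - log(sum_j exp(x_j - m))
    # Shifting by m = max(xs) keeps every exp() argument <= 0,
    # so the exponentials can never overflow.
    m = max(xs)
    log_total = m + math.log(sum(math.exp(x - m) for x in xs))
    return [x - log_total for x in xs]

# Works even where a naive exp(x) would overflow a double:
print(log_softmax_stable([1000.0, 1001.0, 1002.0]))
```

The naive form `log(exp(x).sum())` would overflow for inputs like these, while the shifted form returns finite log-probabilities whose exponentials still sum to 1.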