From the documentation here: http://pytorch.org/docs/master/nn.html#torch.nn.LogSoftmax, log-softmax is defined as:

```
f(x)=log(softmax(x))
```

As I understand it, taking the log of probabilities changes the values: since log maps probabilities in (0, 1] to (-inf, 0], every output of log(softmax(x)) is negative. Because log is monotonically increasing, the ordering is preserved, so the biggest value in softmax(x) is still the biggest (least negative) value in log(softmax(x)), just with a different scale.
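A quick numerical check of how the log transform acts on softmax outputs (a NumPy sketch for illustration, not PyTorch code; the input vector is an arbitrary example):

```python
import numpy as np

def softmax(x):
    # subtract the max for numerical stability before exponentiating
    e = np.exp(x - np.max(x))
    return e / e.sum()

x = np.array([1.0, 2.0, 3.0])
p = softmax(x)
log_p = np.log(p)

# log is monotonically increasing, so the argmax is unchanged,
# even though all log-probabilities are negative
print(np.argmax(p), np.argmax(log_p))  # → 2 2
print(log_p)
```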

Will this change how Negative Log Likelihood computes the loss when the two are combined in `CrossEntropyLoss`?
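For concreteness, here is a minimal NumPy sketch of the composition the question asks about (not PyTorch source; the PyTorch docs state that `CrossEntropyLoss` combines `LogSoftmax` and `NLLLoss` in a single step). The logits and target index are arbitrary examples:

```python
import numpy as np

def log_softmax(x):
    # numerically stable: x - max(x) - log(sum(exp(x - max(x))))
    shifted = x - np.max(x)
    return shifted - np.log(np.sum(np.exp(shifted)))

def nll_loss(log_probs, target):
    # negative log likelihood picks out (and negates) the
    # log-probability assigned to the target class
    return -log_probs[target]

logits = np.array([1.0, 2.0, 3.0])
target = 2

# cross-entropy computed as NLL applied to log-softmax output
loss = nll_loss(log_softmax(logits), target)
print(loss)
```

Because the loss already negates the log-probability, a confident (high) softmax score for the target class yields a loss close to zero, and a low score yields a large positive loss.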