LogSoftmax vs Softmax

I’ve solved the mystery of the softmax here.
By accident I had two LogSoftmax applications: one in my model and one inside my loss function (CrossEntropyLoss applies log-softmax internally). It turns out log-softmax is idempotent: log_softmax(x) = x - logsumexp(x), and the logsumexp of that output is log(1) = 0, so applying log-softmax a second time returns the exact same values. That’s why the model was still training correctly with the duplicate. But when I switched the model output to plain Softmax, the loss function was then taking the log-softmax of probabilities instead of logits, which distorted the numbers.
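Here’s a minimal sketch of what I mean (the tensor shapes and names are just for illustration; I’m assuming class scores along dim=1):

```python
import torch
import torch.nn.functional as F

x = torch.randn(4, 5)  # hypothetical batch of raw logits (batch=4, classes=5)

# Applying log-softmax twice gives the same result as applying it once,
# because the output of log_softmax already has logsumexp == 0.
once = F.log_softmax(x, dim=1)
twice = F.log_softmax(once, dim=1)
print(torch.allclose(once, twice))  # True: log-softmax is idempotent

# Feeding softmax probabilities into a second log-softmax does NOT
# recover the same values, so the loss sees distorted numbers.
probs = F.softmax(x, dim=1)
mixed = F.log_softmax(probs, dim=1)
print(torch.allclose(once, mixed))  # False
```

So passing LogSoftmax output into CrossEntropyLoss happened to be harmless, while passing Softmax output was not.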
Thanks again to all for the explanations.