RuntimeError: Function 'LogBackward0' returned nan values in its 0th output

I just slightly changed SoftMax to make sure that if the input is 0, then the output will be 0 as well.
[screenshot of the modified SoftMax implementation]
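In rough terms, the change is along these lines (my real code is in the screenshot above; the function name and the eps value here are just placeholders):

```python
import torch

def softmax_keep_zero(x, dim=-1, eps=1e-8):
    # force a 0 output wherever the input is exactly 0
    exp_x = torch.exp(x) * (x != 0)
    return exp_x / (exp_x.sum(dim=dim, keepdim=True) + eps)
```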
but I got this error from the loss function:
RuntimeError: Function 'LogBackward0' returned nan values in its 0th output.

Is there something wrong with my epsilon setting?

What's your input tensor? With a large enough value you can easily reach +inf for some elements due to ~e**2x, which will lead to NaN after the division.
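For illustration (a standalone snippet, not your code): a plain exp-and-divide softmax already overflows float32 for inputs around 89, and the resulting inf/inf becomes NaN. The usual stabilization, which the built-in softmax applies internally, is to subtract the row maximum before exponentiating:

```python
import torch

x = torch.tensor([100.0, 1.0, 2.0])   # 100 is enough to overflow exp() in float32
exp_x = torch.exp(x)
print(exp_x)                          # tensor([inf, 2.7183, 7.3891])
print(exp_x / exp_x.sum())            # tensor([nan, 0., 0.])  -- inf / inf = nan

# Subtracting the max first keeps the result finite:
shifted = torch.exp(x - x.max())
print(shifted / shifted.sum())        # all finite, no inf or nan
```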

Can you share a minimal reproducible example? I can see your function could fail in the case of eps=0 with your x tensor containing a zero, as you'd be dividing by zero.
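For reference, here is a standalone snippet (an assumption about your setup, not your actual code) that reproduces the exact error message: a 0.0 in the probabilities combined with a zero upstream gradient gives 0/0 = NaN inside LogBackward0 when anomaly detection is enabled.

```python
import torch
import torch.nn.functional as F

# Pretend this is a softmax output where a zero input was forced to a zero probability.
p = torch.tensor([[0.0, 0.5, 0.5]], requires_grad=True)

with torch.autograd.detect_anomaly():
    loss = F.nll_loss(torch.log(p), torch.tensor([1]))
    loss.backward()
# RuntimeError: Function 'LogBackward0' returned nan values in its 0th output.
# LogBackward0 computes grad_output / p, and at the zeroed position that is 0 / 0 = nan.
```

Clamping before the log (e.g. `torch.log(p.clamp_min(1e-8))`) or using `F.log_softmax` / `F.cross_entropy` directly would avoid that, but seeing your actual function would help.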

Sorry guys, please check my new topic; I have put the main error messages there.

If you found a solution to this problem, please do share (and mark) the solution so others may learn from it! :slight_smile:

I am sure I will. Thanks for the suggestions.