Function 'PowBackward0' returned nan values in its 0th output

Hi,
when I use torch.autograd.set_detect_anomaly(True), PyTorch raises this error:

Function 'PowBackward0' returned nan values in its 0th output.

More specifically, it happens when I use pow:

_x = torch.where(_x <= 0.008856, 9.033 * _x, 1.16 * (_x+epsilon).pow(1 / 3) - 0.16)

Does anyone have ideas on how to avoid this?
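
Here is a minimal, self-contained sketch of what I am seeing (the tensor values and epsilon are just illustrative, not my real data):

```python
import torch

epsilon = 1e-8  # placeholder value, just for illustration

# One negative entry: it takes the linear branch of torch.where,
# but pow's backward still gets evaluated on it.
_x = torch.tensor([-0.5, 0.5], requires_grad=True)

out = torch.where(_x <= 0.008856,
                  9.033 * _x,
                  1.16 * (_x + epsilon).pow(1 / 3) - 0.16)
out.sum().backward()
print(_x.grad)  # tensor([nan, 0.6139]) -- nan for the negative entry
```

With set_detect_anomaly(True) enabled, backward() raises the PowBackward0 error above instead of silently producing the nan gradient.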

Hi,

Is it possible for _x to be negative?
Can you double-check that _x + epsilon > 0?

_x can be negative, but the pow is applied only where _x > 0.008856. Right?

I’m afraid you might be hitting the same problem as https://github.com/pytorch/pytorch/issues/15506: even though torch.where only selects one branch in the forward pass, autograd still backpropagates through both branches. The masked-out gradient is multiplied by zero, and 0 * nan is still nan, so the nan from pow’s backward on negative inputs leaks into _x.grad.

Can you try running _x = torch.where(_x <= 0.008856, 9.033 * _x, 1.16 * (_x.abs() + epsilon).pow(1 / 3) - 0.16)?
This should give the same values but avoid any nan gradients.
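
To illustrate with the same toy values as in the sketch above (still assuming _x is a leaf tensor with requires_grad=True): abs() is a no-op wherever the pow branch is actually selected, since there _x > 0.008856 > 0, and for negative entries it keeps pow’s backward finite, so the masked-out gradient stays 0 instead of becoming 0 * nan = nan.

```python
import torch

epsilon = 1e-8  # placeholder value, as before
_x = torch.tensor([-0.5, 0.5], requires_grad=True)

out = torch.where(_x <= 0.008856,
                  9.033 * _x,
                  1.16 * (_x.abs() + epsilon).pow(1 / 3) - 0.16)
out.sum().backward()
print(_x.grad)  # tensor([9.0330, 0.6139]) -- finite everywhere
```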

You are right. Thanks!!