MalumaDev — November 13, 2019, 5:46pm
Hi,
With torch.autograd.set_detect_anomaly(True) enabled, PyTorch raises this error:

Function 'PowBackward0' returned nan values in its 0th output.

It happens specifically when I use pow:

_x = torch.where(_x <= 0.008856, 9.033 * _x, 1.16 * (_x + epsilon).pow(1 / 3) - 0.16)

Does anyone have an idea of how to avoid this?
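A minimal, self-contained sketch that reproduces the error; epsilon and the example values are assumptions for illustration, not from the original post:

import torch

torch.autograd.set_detect_anomaly(True)

epsilon = 1e-8  # assumed value for illustration
_x = torch.tensor([-0.5, 0.5], requires_grad=True)  # one negative element
out = torch.where(_x <= 0.008856,
                  9.033 * _x,
                  1.16 * (_x + epsilon).pow(1 / 3) - 0.16)
out.sum().backward()  # raises: Function 'PowBackward0' returned nan values in its 0th output.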
albanD (Alban D) — November 13, 2019, 6:13pm
Hi,
Is it possible for _x to be negative? Can you double check that _x + epsilon > 0?
MalumaDev — November 13, 2019, 6:19pm
_x can be negative, but the pow is applied only where _x > 0.008856, right?
albanD (Alban D) — November 13, 2019, 6:23pm
I’m afraid you might be hitting the same problem as https://github.com/pytorch/pytorch/issues/15506: torch.where computes the gradient of both branches, so the NaN gradient from pow on a negative input leaks through even for elements that select the linear branch (0 * NaN is still NaN).
Can you try running:

_x = torch.where(_x <= 0.008856, 9.033 * _x, 1.16 * (_x.abs() + epsilon).pow(1 / 3) - 0.16)

This should give the same values but avoid any NaN gradients.
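To check the fix, a small sketch under the same assumed epsilon and example values as above. The forward output is unchanged because where() still selects the linear branch for _x <= 0.008856, while abs() keeps the pow branch (and its gradient) well-defined everywhere:

import torch

torch.autograd.set_detect_anomaly(True)

epsilon = 1e-8  # assumed value for illustration
_x = torch.tensor([-0.5, 0.5], requires_grad=True)
out = torch.where(_x <= 0.008856,
                  9.033 * _x,
                  1.16 * (_x.abs() + epsilon).pow(1 / 3) - 0.16)
out.sum().backward()  # no anomaly raised
print(_x.grad)  # finite gradients for both elements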