[Bug Report] Unexpected NaN gradient from torch.where even though torch.abs(x - y) >= 0 while x == y

The following code really confuses me. The condition cond = n < beta is never true, yet when I run the backward pass, input.grad is all NaN. I believe this is caused by 0.5 * n**2 / beta, but I would like to know how this is possible. Thanks!

>>> import torch 
>>> torch.__version__
'1.8.0+cpu'
>>> input = torch.tensor([[-0.6205, 0.225],[-0.8916, 0.4492]], requires_grad = True) 
>>> target = torch.tensor([[-0.6205, 0.225],[-0.8916, 0.4492]]) 
>>> beta = 0 
>>> n = torch.abs(input - target) 
# Since input == target, n is all zeros, so cond is never true.
>>> cond = n < beta 
>>> loss = torch.where(cond, 0.5 * n**2 / beta, n - 0.5 * beta)
>>> loss = loss.sum() 
>>> loss.backward() 
>>> input.grad
tensor([[nan, nan],
        [nan, nan]])

I believe this is a known effect of the implementation of torch.where; see this earlier discussion: Gradients of torch.where
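
Concretely, the unselected branch is still evaluated: with n == 0 and beta == 0, 0.5 * n**2 / beta is 0/0 = NaN already in the forward pass. In the backward pass, torch.where masks the incoming gradient with zeros, but the chain rule then multiplies that zero by the branch's local derivative n / beta, which is also 0/0 = NaN, and 0 * NaN = NaN under IEEE 754. Here is a minimal sketch of just that mechanism (made-up scalar values, not the autograd internals):

>>> x = torch.tensor(0., requires_grad=True)
>>> bad = 0.5 * x**2 / 0.0            # forward value is already 0/0 = nan
>>> out = torch.where(x < 0, bad, x)  # cond is False, so bad is never selected
>>> out.backward()
>>> x.grad                            # 0 * nan from the masked branch poisons the result
tensor(nan)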

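A common workaround (my own sketch, not an official API) is to make sure the NaN-producing expression is never evaluated: special-case beta == 0, and guard the divided branch with a second torch.where so that even the unselected lane stays finite (the double-where trick from the thread above):

>>> if beta == 0:
...     loss = n  # pure L1; the quadratic branch would be unreachable anyway
... else:
...     cond = n < beta
...     # substitute beta where cond is False so n**2 / beta stays finite
...     safe_n = torch.where(cond, n, torch.full_like(n, beta))
...     loss = torch.where(cond, 0.5 * safe_n**2 / beta, n - 0.5 * beta)
>>> loss.sum().backward()

With this, input.grad is finite (all zeros here, since PyTorch takes the derivative of torch.abs to be 0 at 0).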