NaN gradients with torch.angle()

Looking at the torch.angle() description (torch.angle — PyTorch 1.9.1 documentation), it says that the behavior of torch.angle() changed in 1.8.0. The following is the note from that page.

====== Note ======
Starting in PyTorch 1.8, angle returns pi for negative real numbers, zero for non-negative real numbers, and propagates NaNs. Previously the function would return zero for all real numbers and not propagate floating-point NaNs.
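
For reference, this is what that behavior should look like on a purely real tensor (a quick sketch; the commented outputs are what the note implies and assume PyTorch >= 1.8):

    import torch

    x = torch.tensor([-2.0, 0.0, 3.0, float('nan')])
    print(torch.angle(x))
    # Expected: tensor([3.1416, 0.0000, 0.0000, nan])
    # pi for the negative entry, zero for the non-negative entries,
    # and the NaN input is carried through to the output.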

Here, I don’t understand what is meant by ‘propagates NaNs’. I understood it to mean that the gradient will be NaN when the input to torch.angle() is a real value. However, when I tested it, the gradient is not NaN.

    import torch

    # Complex input: one zero, two purely real values, and one complex value
    x1 = torch.tensor([[0, 4, -4, 1+4j]], dtype=torch.cfloat, requires_grad=True)
    out = x1.angle()
    print(out)
    out.mean().backward()
    print(x1.grad)

The corresponding printed outputs are:

    tensor([[0.0000, 0.0000, 3.1416, 1.3258]], grad_fn=<AngleBackward>)
    tensor([[ 0.0000+0.0000j,  0.0000+0.0625j, -0.0000-0.0625j, -0.0588+0.0147j]])

If my understanding of the note is correct, the gradient from angle() should be NaN when its input is a real value, but it is not.

Could anyone help me understand when torch.angle() returns NaN as its gradient?
It looks like I am facing a NaN issue because of the angle() function: when I remove angle(), I don’t see NaNs anymore. I am trying to look into this function to see when its gradient becomes NaN.

(The code was tested with PyTorch 1.9.0.)

“Propagates NaNs” means that this function will output NaN values if the input already contains NaNs (i.e., it propagates/forwards them). This should apply to both the forward and backward methods. Based on the description, I would guess that NaN inputs were mapped to zeros in previous versions.
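
For example, a NaN that is already present in the input should show up both in the output of angle() and in the computed gradient (a minimal sketch; the commented values are what I would expect on PyTorch >= 1.8):

    import torch

    # One finite entry and one NaN entry in a complex tensor
    x = torch.tensor([1.0 + 1.0j, complex(float('nan'), 0.0)], requires_grad=True)
    out = x.angle()
    print(out)  # should print tensor([0.7854, nan], grad_fn=<AngleBackward>)

    out.sum().backward()
    print(x.grad)  # the entry corresponding to the NaN input should be NaN as well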