torch.le() gradient

Can someone tell me whether torch computes a gradient for the le operation, and how I can test that?
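
A minimal way to check, assuming a recent PyTorch version: comparison ops return a bool tensor that autograd does not track, so the result carries no backward node.

```python
import torch

a = torch.randn(3, requires_grad=True)
b = torch.randn(3)

out = torch.le(a, b)       # element-wise a <= b, returns a bool tensor
print(out.requires_grad)   # False: autograd does not track comparisons
print(out.grad_fn)         # None: no backward node was created
```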

The problem is that the gradient of this function is 0 almost everywhere, so it is quite useless.


But don’t you think we can’t really find a gradient for a <= b? It should be None.

ceil also has a gradient of zero, and I can access its gradient using param.grad, but in the case of le it returns None. Should I assume that PyTorch gives None as the gradient? I also found this: https://github.com/pytorch/pytorch/blob/7707dee76169a2c2d6637f80af5ed59eeb32a997/torch/csrc/jit/symbolic_script.cpp#L825
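
A small sketch of the difference, assuming a recent PyTorch version: ceil has a defined (zero) backward, so .grad gets populated with zeros, while le produces a bool tensor that is cut off from the graph entirely.

```python
import torch

x = torch.randn(3, requires_grad=True)

# ceil is registered as differentiable with gradient 0, so backward
# runs and x.grad is filled with zeros
torch.ceil(x).sum().backward()
print(x.grad)  # tensor([0., 0., 0.])

# le returns a bool tensor with no autograd history at all,
# so there is nothing to call backward through
y = torch.le(x, torch.zeros(3))
print(y.grad_fn)  # None
```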

What’s your call on this?

In autograd semantics, None and a Tensor full of 0s are the same thing. None is just more memory-efficient, and a Tensor full of 0s is needed when some part of the tensor might have non-zero gradients.
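
A small illustration of that equivalence, using a hypothetical second leaf y that takes no part in the computation: its .grad stays None, which autograd treats the same as a tensor of zeros.

```python
import torch

x = torch.randn(3, requires_grad=True)
y = torch.randn(3, requires_grad=True)

out = (x * 2).sum()  # y takes no part in the computation
out.backward()

print(x.grad)  # tensor([2., 2., 2.])
print(y.grad)  # None: semantically the same as zeros, but cheaper
```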