How does Autograd deal with non-differentiable functions such as abs() and max()?

Mostly, some (more or less) arbitrary extension of the derivative from the adjacent intervals is used.
One convention that people seem to like - and that PyTorch mostly follows - is to use a zero derivative at the non-differentiable point if the derivative is zero on one side of it, e.g. for relu at zero.
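As a quick illustration (a minimal sketch; the exact convention can in principle differ between ops or versions, but both gradients typically come out as 0):

```python
import torch

# Gradient of abs() at the non-differentiable point x = 0
x = torch.tensor(0.0, requires_grad=True)
torch.abs(x).backward()
print(x.grad)  # typically tensor(0.) -- one valid choice of subgradient

# Gradient of relu() at the kink x = 0
y = torch.tensor(0.0, requires_grad=True)
torch.relu(y).backward()
print(y.grad)  # typically tensor(0.) -- the zero-derivative side is used
```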

Best regards

Thomas
