How does Torch threshold the values in the Hardtanh function?

I was going through the Hardtanh activation function. The definition involves clipping (or clamping) the values between -1 and 1. How does Torch threshold the values for this operation while still remaining differentiable?

Hi,

It does regular clamping, just like you would do with the clamp function. The gradient for the constant parts of the function is just 0.
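For example, here is a quick sketch (assuming a recent PyTorch install) showing that the forward pass is the same as a plain clamp to [-1, 1]:

```python
import torch
import torch.nn.functional as F

x = torch.linspace(-3, 3, 7)

# Hardtanh's output matches clamping the input to [-1, 1]
print(F.hardtanh(x))              # tensor([-1., -1., -1.,  0.,  1.,  1.,  1.])
print(torch.clamp(x, -1.0, 1.0))  # same values
```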
Does that answer your question?


You mean torch.clamp?

Yes.
Just like when you have ReLU with a negative input: the function is constant in that region, so it returns a 0 gradient.
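A small sketch of what autograd gives you (again just illustrative, assuming a recent PyTorch):

```python
import torch
import torch.nn.functional as F

x = torch.tensor([-2.0, 0.5, 2.0], requires_grad=True)
F.hardtanh(x).sum().backward()

# Gradient is 1 inside (-1, 1) and 0 in the clipped (constant) regions,
# just like ReLU's gradient is 0 for negative inputs.
print(x.grad)  # tensor([0., 1., 0.])
```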