I am trying to backpropagate through a torch.heaviside function, but it seems that it disables the gradient for its output. How can I include it in backpropagation?

If it is non-differentiable, is there any function that produces only 0 and 1?

Hi Yegane!

You can’t use the actual Heaviside function in backpropagation.

Instead, use a “soft,” usefully-differentiable approximation to it, such as `torch.sigmoid()`.

There is no such function that would be useful for backpropagation. Such a function would have zero derivative in regions where it was either zero or one, and an undefined derivative at the points where it changed from zero to one. In neither case would you get any non-trivial backpropagation.

Best.

K. Frank