Step Activation Function

Is there a step activation function in pytorch? One that returns -1 for values < 0 and 1 for values > 0

You can define this activation function on your own; there is a good tutorial on the PyTorch website covering custom functions and modules.

However, this function is not differentiable at 0 (and its gradient is zero everywhere else), so you should be careful about using it during training.
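For what it's worth, a hand-rolled version is only a couple of lines. Here's a sketch, assuming the convention that 0 maps to +1 (the built-in `torch.sign` maps 0 to 0, which is otherwise very close to what you're asking for):

```python
import torch

def step(x: torch.Tensor) -> torch.Tensor:
    # Hypothetical helper: -1 for x < 0, +1 otherwise.
    # Note: 0 maps to +1 here; torch.sign(0) returns 0 instead.
    return torch.where(x < 0, torch.full_like(x, -1.0), torch.full_like(x, 1.0))

x = torch.tensor([-2.0, -0.5, 0.0, 3.0])
print(step(x))        # tensor([-1., -1.,  1.,  1.])
print(torch.sign(x))  # tensor([-1., -1.,  0.,  1.])
```

Either way, the gradient through this is zero almost everywhere, so on its own it won't train with backpropagation.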


Hello Ziqi and Ayman!

tanh() is a commonly-used differentiable approximation to the
step function, and is sometimes used as an activation function.
(We often call these differentiable approximations “soft” versions
of the functions they approximate.)
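As a sketch, you can make the soft step as sharp as you like with a steepness factor (the factor `k` here is my own illustrative parameter, not anything built in) while keeping it differentiable everywhere:

```python
import torch

def soft_step(x: torch.Tensor, k: float = 10.0) -> torch.Tensor:
    # tanh(k * x) approaches the hard -1/+1 step as k grows,
    # but remains smooth, so gradients can flow through it.
    return torch.tanh(k * x)

x = torch.tensor([-2.0, -0.1, 0.1, 2.0], requires_grad=True)
y = soft_step(x)
y.sum().backward()  # works, unlike with a hard step
print(y)
print(x.grad)
```

Note that a very large `k` makes the gradient vanish away from zero, so there is a trade-off between sharpness and trainability.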


K. Frank

Yeah, I think using tanh makes sense.