Customizing activation backward() with variable threshold value?

Hey there,

first of all, nice framework!
To the core problem: I have an activation function that is non-differentiable at 0 and depends on a threshold value. It maps tensor values to 0 if their absolute value is below a threshold alpha.
I have implemented a custom class inheriting from torch.autograd.Function with both static methods forward() and backward(), both of which depend on alpha.

Since I want to run multiple experiments with different alpha values, I'm looking for a convenient way to just pass in the threshold value and modify it. Any ideas?
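In case a concrete sketch helps frame the question: one common pattern is to pass alpha as an extra argument to forward() and return None for its gradient in backward(). The class name `ThresholdActivation` and the exact masking semantics below are my assumptions based on the description above, not the actual implementation:

```python
import torch

class ThresholdActivation(torch.autograd.Function):
    """Maps values with |x| < alpha to 0; identity elsewhere (assumed semantics)."""

    @staticmethod
    def forward(ctx, input, alpha):
        # Keep only entries whose absolute value reaches the threshold.
        mask = input.abs() >= alpha
        ctx.save_for_backward(mask)
        return input * mask

    @staticmethod
    def backward(ctx, grad_output):
        (mask,) = ctx.saved_tensors
        # Gradient passes through kept entries; alpha gets no gradient (None).
        return grad_output * mask, None

# Different alpha per experiment, no class changes needed:
x = torch.tensor([0.2, -0.05, 1.0], requires_grad=True)
y = ThresholdActivation.apply(x, 0.1)
```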

Thanks in advance