autograd.Function with a parameter

Hello there,

I am trying to implement a custom activation function (similar to ReLU) with forward/backward defined as static methods.
When I call Function.apply, the code below fails with an error that self.param1 is not found.

pseudo-code:

class F1(Function):

    def __init__(self, param1):
        self.param1 = param1

    @staticmethod
    def forward(ctx, input):
        # error here: self.param1 is not found
        output = input * (input > self.param1)
        return output

    @staticmethod
    def backward(ctx, grad_output):
        grad_input = grad_output.clone()
        # same problem: no self available here either
        grad_input[input <= self.param1] = 0
        return grad_input

I would like to pass a threshold parameter by which the input values are filtered…
any ideas? Thanks in advance.


Found the solution on the forum…
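
For anyone who lands here later, the usual pattern is to pass the threshold as an extra argument to forward and keep whatever backward needs on ctx, rather than storing it on self (forward/backward are static methods, so there is no self). A minimal sketch of that pattern; the name ThresholdFn and the threshold argument are my own, not from the original code:

import torch
from torch.autograd import Function

class ThresholdFn(Function):

    @staticmethod
    def forward(ctx, input, threshold):
        # keep the mask so backward can reuse it
        mask = input > threshold
        ctx.save_for_backward(mask)
        return input * mask

    @staticmethod
    def backward(ctx, grad_output):
        mask, = ctx.saved_tensors
        grad_input = grad_output.clone()
        grad_input[~mask] = 0
        # one return value per forward argument; the threshold needs no gradient
        return grad_input, None

Usage:

x = torch.randn(5, requires_grad=True)
y = ThresholdFn.apply(x, 0.3)
y.sum().backward()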