Calling F.relu() in forward() of custom activation function

Hi,
I am defining a custom activation function that inherits from torch.autograd.Function and implements the static methods forward() and backward(). If I call F.relu() inside forward(), will that call be tracked by autograd (which is undesirable)?
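For context, here is a minimal sketch of what I mean (the class name and the backward formula are just placeholders for my actual function):

```python
import torch
import torch.nn.functional as F

class MyReLU(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        # The question: is this F.relu() call tracked by autograd?
        out = F.relu(x)
        ctx.save_for_backward(x)
        return out

    @staticmethod
    def backward(ctx, grad_output):
        x, = ctx.saved_tensors
        # Gradient of ReLU: pass the gradient through where the input was positive
        return grad_output * (x > 0).to(grad_output.dtype)
```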

Thank you!


Could you check if wrapping your call to F.relu() inside a with torch.no_grad(): block would work?
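Something along these lines (a sketch based on your example above, with placeholder names):

```python
import torch
import torch.nn.functional as F

class MyReLU(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        # Explicitly disable gradient tracking around the functional call
        with torch.no_grad():
            out = F.relu(x)
        ctx.save_for_backward(x)
        return out

    @staticmethod
    def backward(ctx, grad_output):
        x, = ctx.saved_tensors
        return grad_output * (x > 0).to(grad_output.dtype)
```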


@eqy's suggestion would also work, but gradient calculation is already disabled by default inside custom autograd.Functions. You can verify this via print(torch.is_grad_enabled()), which will return False inside both the forward and backward methods.
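For example, a quick check could look like this (again just a sketch with placeholder names):

```python
import torch
import torch.nn.functional as F

class MyReLU(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        print(torch.is_grad_enabled())  # prints False: grad mode is already off here
        out = F.relu(x)
        ctx.save_for_backward(x)
        return out

    @staticmethod
    def backward(ctx, grad_output):
        print(torch.is_grad_enabled())  # prints False here as well (unless create_graph=True)
        x, = ctx.saved_tensors
        return grad_output * (x > 0).to(grad_output.dtype)

x = torch.randn(4, requires_grad=True)
y = MyReLU.apply(x)
y.sum().backward()
```

So the explicit torch.no_grad() wrapper isn't required, although it doesn't hurt.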


@ptrblck & @eqy, thanks for the prompt and helpful replies!
