Doubt about the backward() implementation in a custom activation function

(1) If I write “output = torch.square(input)” in the custom activation’s forward() in the code below, I think it doesn’t require a backward() implementation, since it uses a torch function.
(2) But if I write “output = input * input”, it requires an implementation of backward(). Is the code below fine for the custom function “output = input * input”?

Let me know if I am missing something.

import torch
from torch.autograd import Function

class my_act(Function):

    # Note that both forward and backward are @staticmethods
    @staticmethod
    def forward(ctx, input):
        # an activation only takes the input tensor, so no weight/bias here
        ctx.save_for_backward(input)
        output = input * input
        return output

    @staticmethod
    def backward(ctx, grad_output):
        # ctx.saved_tensors is a tuple, so unpack the saved input
        input, = ctx.saved_tensors
        # derivative of x*x is 2*x, chained with grad_output
        grad_input = grad_output * (2 * input)
        return grad_input
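
A handwritten backward like this can be sanity-checked numerically with torch.autograd.gradcheck (a minimal sketch, assuming the class above; gradcheck wants double precision):

import torch

x = torch.randn(5, dtype=torch.double, requires_grad=True)
# compares the analytic backward above against numerical gradients
print(torch.autograd.gradcheck(my_act.apply, (x,)))  # True if they match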

Hi,

Why are you using a custom Function here?
If you just want autograd to compute the gradients for you, just put that in a regular Python function; autograd handles both torch.square(input) and input * input the same way.

You should only use a custom Function if your code does something autograd cannot handle, or if you want to return something other than the true gradient.
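
For example, a minimal sketch of the plain-function approach (either expression works; no custom Function needed):

import torch

# autograd tracks both torch.square(input) and input * input automatically
def my_act(input):
    return input * input

x = torch.randn(3, requires_grad=True)
my_act(x).sum().backward()
print(x.grad)  # equals 2 * x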

I just want to implement a Kernel Density Estimator (KDE), since there is no torch function for that, and if I use the scikit-learn KDE I need to detach the tensor, which blocks gradient flow. So I was just clearing up my doubt with the demo program I showed.

I would be very thankful if you have some idea regarding KDE. Is there any built-in PyTorch function, or do I need to implement a custom function for KDE?

If a custom function is required for KDE, do you have any idea or a link, in case it is already implemented somewhere?

There is no KDE implementation in core that I know of. You might want to Google around, though, as someone else might already have implemented it in a third-party lib.

Otherwise, for this you indeed need a custom Function.
So you will need to make sure you have the backward formula written down for the KDE evaluation.
And in that case, yes, your sample above is correct.
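
For what it’s worth, here is a minimal sketch of what such a custom Function could look like for a 1-D Gaussian KDE. The class name, shapes, and the choice to propagate gradients only to the query points are assumptions for illustration, not an existing PyTorch API:

import math
import torch
from torch.autograd import Function

class GaussianKDE1D(Function):
    # hypothetical example: density of 1-D samples evaluated at query points x
    @staticmethod
    def forward(ctx, x, samples, bandwidth):
        # u has shape (num_queries, num_samples)
        u = (x.unsqueeze(1) - samples.unsqueeze(0)) / bandwidth
        k = torch.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)
        ctx.save_for_backward(u, k)
        ctx.bandwidth = bandwidth
        # density(x) = mean_i K((x - s_i) / h) / h
        return k.mean(dim=1) / bandwidth

    @staticmethod
    def backward(ctx, grad_output):
        u, k = ctx.saved_tensors
        h = ctx.bandwidth
        # d density / d x = mean_i( -u_i * K(u_i) ) / h^2
        grad_x = grad_output * (-(u * k).mean(dim=1) / (h * h))
        # no gradients for samples and bandwidth in this sketch
        return grad_x, None, None

samples = torch.randn(100)
x = torch.linspace(-3.0, 3.0, 50, requires_grad=True)
density = GaussianKDE1D.apply(x, samples, 0.5)
density.sum().backward()  # gradients flow to x through the custom backward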

OK. Thanks a lot for your guidance.