Pass variable to custom tanh function

Hi, I am making a custom activation function here.
I want to pass a variable (which is pre-defined outside of the function) to the backward function.

Here is the code.
But it says:

AttributeError: 'custom_tanh_variableBackward' object has no attribute 'variable'

So I don't know what to do.
Help me.

self.activation = custom_tanh_variable(variable).apply


class custom_tanh_variable(torch.autograd.Function):

    def __init__(self, variable):
        super().__init__()
        self.variable = variable

    @staticmethod
    def forward(ctx,input):
        minus_input = input*(-1)
        tmp1 = torch.exp(input) - torch.exp(minus_input)
        tmp2 = torch.exp(input) + torch.exp(minus_input)
        ans  = torch.div(tmp1,tmp2)
        ctx.save_for_backward(ans)
        return ans

    @staticmethod
    def backward(ctx,grad_output):

        input, = ctx.saved_tensors
        tmp = ctx.variable
        print(tmp)

        return grad_output*(1-torch.square(input))

You are using staticmethods, so you would have to pass the variable to the forward and/or backward method. If you need to register a parameter/buffer etc., create a custom nn.Module, register the data there, and call the custom autograd.Function in the module's forward.
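Something along these lines should work; a minimal sketch, assuming variable is a plain Python value rather than a tensor (torch.tanh stands in for the manual exp formula):

import torch

class CustomTanh(torch.autograd.Function):

    @staticmethod
    def forward(ctx, input, variable):
        ans = torch.tanh(input)
        ctx.save_for_backward(ans)   # tensors needed in backward
        ctx.variable = variable      # non-tensor state can be stored directly on ctx
        return ans

    @staticmethod
    def backward(ctx, grad_output):
        ans, = ctx.saved_tensors
        print(ctx.variable)          # the value passed to forward is available here
        # backward returns one value per forward argument: the gradient w.r.t. input,
        # and None for the non-differentiable variable
        return grad_output * (1 - torch.square(ans)), None

Calling out = CustomTanh.apply(x, variable) would then print the variable during the backward pass.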

@ptrblck Hi, having a good weekend?

  1. Do you want me to do it somehow like this (Extending PyTorch — PyTorch 1.11.0 documentation)?

  2. I cannot understand the 'register' part. Do you mean like this (self.variable = variable)?

Thank you sir.

  1. Yes, the tutorial explains how to write a custom nn.Module which you could use as a template to implement yours.

  2. Yes, assuming variable is an nn.Parameter. If it's a buffer (i.e. a non-trainable tensor which should be used in the forward and stored in the state_dict), use self.register_buffer, e.g. as sketched below.
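A minimal sketch of what "register" means here; scale is a made-up name for illustration:

import torch
import torch.nn as nn

class CustomTanhModule(nn.Module):
    def __init__(self, scale):
        super().__init__()
        # buffer: stored in the state_dict and moved with .to()/.cuda(), but not trained
        self.register_buffer('scale', torch.as_tensor(scale))
        # if it should be trained instead, use a parameter:
        # self.scale = nn.Parameter(torch.as_tensor(scale, dtype=torch.float32))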


Thank you @ptrblck.

One last question. In here (Extending PyTorch — PyTorch 1.11.0 documentation), there is no backward function in the nn.Module. I want to use a custom backward function, as I originally did with autograd.Function in my first post in this thread.

So, can I write def backward in the nn.Module and call backward in the autograd.Function?

Or do I not have to write def backward at all, because the nn.Module automatically uses the 'backward' from my own autograd.Function (which has the custom backward from my original post in this thread)?

No need to write a custom backward function in nn.Module since you’ve already defined the custom backward in the autograd.Function. PyTorch is smart enough to call this backward method if you use your autograd.Function via .apply in nn.Module.forward.
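If you want to double-check that dispatch, here is a small self-contained sketch: torch.autograd.gradcheck calls the Function via .apply and compares the hand-written backward against numerical gradients (it needs double-precision inputs).

import torch

class TanhFn(torch.autograd.Function):
    @staticmethod
    def forward(ctx, input):
        ans = torch.tanh(input)
        ctx.save_for_backward(ans)
        return ans

    @staticmethod
    def backward(ctx, grad_output):
        ans, = ctx.saved_tensors
        return grad_output * (1 - torch.square(ans))   # d/dx tanh(x) = 1 - tanh(x)^2

x = torch.randn(4, dtype=torch.double, requires_grad=True)
print(torch.autograd.gradcheck(TanhFn.apply, (x,)))    # True if the custom backward is correct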

Hi, @ptrblck

I have tried various ways, but I couldn't find the solution.
Here is the code that I tried.

I think I still don't know how to pass the variable to the backward function. :frowning:

class Module_custom_tanh(nn.Module):
    def __init__(self, val):
        super(Module_custom_tanh, self).__init__()
        self.val = val

    def forward(self, input):
        return custom_tanh.apply(input, self.val)


class custom_tanh(torch.autograd.Function):
    def __init__(self):
        super().__init__()

    @staticmethod
    def forward(ctx, input, val):
        minus_input = input*(-1)
        tmp1 = torch.exp(input) - torch.exp(minus_input)
        tmp2 = torch.exp(input) + torch.exp(minus_input)
        ans  = torch.div(tmp1, tmp2)
        ctx.save_for_backward(ans)
        ctx.register_buffer('val', val)
        return ans

    @staticmethod
    def backward(ctx, grad_output):
        input, = ctx.saved_tensors
        val = ctx.val
        print(val)
        return grad_output*(1-torch.square(input))

But this gets me: AttributeError: 'custom_tanh_non_sparsityBackward' object has no attribute 'register_buffer'

autograd.Functions are stateless and cannot register buffers.
If you want to register self.val as a buffer or parameter, do so in the nn.Module (i.e. Module_custom_tanh).
In your custom_tanh, pass all needed tensors to the forward and use ctx.save_for_backward if you need specific tensors in the backward function.
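Putting that together, a working sketch of the two classes above, assuming val is a plain Python number (if it is a tensor you need in the backward, pass it through ctx.save_for_backward instead):

import torch
import torch.nn as nn

class custom_tanh(torch.autograd.Function):

    @staticmethod
    def forward(ctx, input, val):
        ans = torch.tanh(input)
        ctx.save_for_backward(ans)   # tensors needed in backward
        ctx.val = val                # plain attribute on ctx; no register_buffer here
        return ans

    @staticmethod
    def backward(ctx, grad_output):
        ans, = ctx.saved_tensors
        print(ctx.val)
        # one return value per forward argument: gradient w.r.t. input, None for val
        return grad_output * (1 - torch.square(ans)), None


class Module_custom_tanh(nn.Module):
    def __init__(self, val):
        super().__init__()
        self.val = val               # or register_buffer / nn.Parameter if val is a tensor

    def forward(self, input):
        return custom_tanh.apply(input, self.val)

Note that backward now returns None in the slot for val, since backward has to return one value per forward argument.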


Plus, after resolving this issue (How to mark argument as nondifferentiable in a custom autograd Function), I am good :slight_smile:

Thank you sir, @ptrblck.