# How to call only the backward pass of a PyTorch function?

Hi,

I'm trying to implement an asymmetric threshold function:

1. The forward pass computes an ordinary threshold, and the backward pass computes the derivative of the sigmoid function.

So I need to call only the backward pass of `nn.functional.sigmoid` in my own backward pass. How can I do it? I think it will be faster than a self-implemented derivative of the sigmoid.

Hi,

To do so, you want to create your own `Function` where you reimplement the sigmoid backward.
It should be fairly easy, as it is just `grad_output * (1 - output) * output`, where `output` is the output of the forward pass and `grad_output` is the gradient passed as a parameter to `backward`.

So, yes, I did it:

```python
import torch

def where(cond, x_1, x_2):
    cond = cond.float()
    return (cond * x_1) + ((1 - cond) * x_2)

class AsymThreshold(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        _zeros = torch.zeros_like(x)
        _ones = torch.ones_like(x)
        return where(x > 0, _ones, _zeros)

    @staticmethod
    def backward(ctx, grad_output):
        x, = ctx.saved_tensors
        _slope = 100
        # derivative of sigmoid(_slope * x), written with explicit exponentials
        grad_input = _slope * torch.exp(-_slope * x) / torch.pow(1 + torch.exp(-_slope * x), 2)
        return grad_input * grad_output
```

But I get NaN values in the gradients (I think the problem is related to the large slope of the sigmoid, but I don't have a good idea how to fix it).
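The NaNs come from the explicit exponentials: for negative `x`, `exp(-slope * x)` overflows to `inf` and the ratio becomes `inf / inf = NaN`. A numerically stable alternative (a sketch, not from the thread) is to express the same derivative through `torch.sigmoid`, which never overflows, using the identity d/dx σ(kx) = k · σ(kx) · (1 − σ(kx)):

```python
import torch

def stable_sigmoid_grad(x, slope=100.0):
    # s saturates cleanly to 0 or 1 instead of producing inf / inf
    s = torch.sigmoid(slope * x)
    return slope * s * (1 - s)

x = torch.tensor([-10.0, 0.0, 10.0])
print(stable_sigmoid_grad(x))  # finite everywhere; the exp-based formula gives NaN at x = -10
```

This computes exactly the same quantity as `slope * exp(-slope*x) / (1 + exp(-slope*x))**2`, just in a form that avoids the overflowing intermediate.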

I would do it slightly differently:

```python
class AsymThreshold(torch.autograd.Function):
    @staticmethod
    def forward(ctx, *args, **kwargs):
        output = torch.nn.functional.threshold(*args, **kwargs)
        ctx.save_for_backward(output)
        return output

    @staticmethod
    def backward(ctx, grad_output):
        output, = ctx.saved_tensors
        # derivative of sigmoid expressed with the forward output
        return grad_output * output * (1 - output)
```
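For reference, here is a complete, runnable version of that idea together with a usage call. I've made the `threshold` and `value` arguments explicit (rather than `*args, **kwargs`) because `backward` must return one gradient per `forward` input, with `None` for the non-tensor arguments; the class name follows the snippet above:

```python
import torch

class AsymThreshold(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, threshold, value):
        # ordinary threshold in the forward pass
        output = torch.nn.functional.threshold(x, threshold, value)
        ctx.save_for_backward(output)
        return output

    @staticmethod
    def backward(ctx, grad_output):
        output, = ctx.saved_tensors
        # sigmoid-style derivative written in terms of the forward output;
        # None gradients for the two non-tensor arguments
        return grad_output * output * (1 - output), None, None

x = torch.randn(5, requires_grad=True)
y = AsymThreshold.apply(x, 0.0, 0.0)  # custom Functions are invoked via .apply
y.sum().backward()
print(x.grad)
```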