Manually set gradient of the last layer with binary output

Thanks for the reply.
N is the normal distribution.
Oh, I forgot to mention:
if max(0, x + N(0, sigmoid(x))) > 0, the output is 1; otherwise the output is 0.
For backpropagation, I take the gradient of this function to be 0 when the output is <= 0 and 1 when the output is > 0.
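In other words (writing σ for the sigmoid; this gradient is just a surrogate I picked, not the true derivative):

$$
f(x) = \begin{cases} 1 & \text{if } \max\bigl(0,\ x + \epsilon\bigr) > 0,\quad \epsilon \sim \mathcal{N}\bigl(0,\ \sigma(x)\bigr) \\ 0 & \text{otherwise} \end{cases}
\qquad
\frac{\partial f}{\partial x} := \begin{cases} 1 & f(x) > 0 \\ 0 & f(x) \le 0 \end{cases}
$$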

I have found a way to add this function and its gradient to backprop, but I don't know whether it is right or not.
I used this link: https://pytorch.org/tutorials/beginner/examples_autograd/two_layer_net_custom_function.html

and changed the ReLU and its gradient there to my function and its gradient.
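Here is a minimal sketch of what that looks like, following the tutorial above. (Assumptions on my part: I treat sigmoid(x) as the standard deviation of the noise rather than the variance, and `StochasticBinary` is just a placeholder name.)

```python
import torch

class StochasticBinary(torch.autograd.Function):
    """Binary activation: 1 if max(0, x + N(0, sigmoid(x))) > 0, else 0,
    with the surrogate gradient described above (1 where the output is 1,
    0 where the output is 0)."""

    @staticmethod
    def forward(ctx, x):
        # Noise with standard deviation sigmoid(x) (assumption: std, not variance).
        noise = torch.randn_like(x) * torch.sigmoid(x)
        out = (torch.clamp(x + noise, min=0) > 0).to(x.dtype)
        ctx.save_for_backward(out)
        return out

    @staticmethod
    def backward(ctx, grad_output):
        (out,) = ctx.saved_tensors
        # Surrogate: pass the gradient through where the output was 1, block it otherwise.
        return grad_output * out
```

And a quick check that gradients flow the way I intended:

```python
x = torch.randn(4, requires_grad=True)
y = StochasticBinary.apply(x)  # use .apply, not StochasticBinary(x)
y.sum().backward()
print(x.grad)  # equals y: 1s where the unit fired, 0s elsewhere
```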