I am working with binary data in a generative network and want the last layer to output binary values.
I have added a noisy rectified linear unit layer:
max(0, x + N(0, sigmoid(x)))
as the last layer, in order to map the values to (0, 1).
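A minimal sketch of this forward computation in PyTorch (assuming sigmoid(x) is meant as the standard deviation of the noise; adjust if it is meant as the variance):

```python
import torch

def noisy_relu(x):
    # max(0, x + N(0, sigmoid(x))): add zero-mean Gaussian noise whose
    # per-element scale is sigmoid(x), then clip negatives at 0.
    noise = torch.randn_like(x) * torch.sigmoid(x)
    return torch.clamp(x + noise, min=0)
```

Note that this produces non-negative real values, not values restricted to (0, 1).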
I don't know how to define the gradient for this layer and add it to backpropagation.
Please help me.
How did you define N?
If I'm not mistaken, this line of code would not normalize the output to [0, 1], but would clip it at 0 and probably add noise to all positive values?
If N is implemented using PyTorch operations, Autograd would probably be able to create the backward pass for you.
Thanks for the reply.
N is the normal distribution.
Oh, I forgot to mention:
if max(0, x + N(0, sigmoid(x))) > 0, the output is 1; otherwise the output is 0.
For backpropagation, I take the gradient of this function to be 0 when the output <= 0 and 1 when the output > 0.
I have found a way to add this function and its gradient to backprop, but I don't know whether it is right or not.
I used this tutorial: https://pytorch.org/tutorials/beginner/examples_autograd/two_layer_net_custom_function.html
and replaced the ReLU and its gradient with my function and its gradient.
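Following that tutorial, a custom `autograd.Function` for this layer might look like the sketch below (assuming sigmoid(x) as the noise standard deviation, and the surrogate gradient described above: 1 where the output is 1, 0 elsewhere):

```python
import torch

class NoisyBinaryReLU(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        # max(0, x + N(0, sigmoid(x))), then threshold to {0, 1}
        noise = torch.randn_like(x) * torch.sigmoid(x)
        pre = torch.clamp(x + noise, min=0)
        out = (pre > 0).float()
        # keep the binary output for use in the backward pass
        ctx.save_for_backward(out)
        return out

    @staticmethod
    def backward(ctx, grad_output):
        # surrogate gradient: 1 where the output was 1, 0 elsewhere
        out, = ctx.saved_tensors
        return grad_output * out
```

It would be used as `y = NoisyBinaryReLU.apply(x)` inside the model's `forward`.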
autograd.Function should work.
You could use gradcheck to verify the gradients, if it's applicable for your method.
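For reference, the gradcheck mechanics look like this on a smooth toy `autograd.Function` (gradcheck compares the analytic backward against numerical differences, so for a stochastic, thresholded forward with a surrogate gradient the check would report a mismatch by design):

```python
import torch
from torch.autograd import gradcheck

class MyExp(torch.autograd.Function):
    # smooth toy function, just to show the gradcheck mechanics
    @staticmethod
    def forward(ctx, x):
        out = x.exp()
        ctx.save_for_backward(out)
        return out

    @staticmethod
    def backward(ctx, grad_output):
        # d/dx exp(x) = exp(x), which was saved in forward
        out, = ctx.saved_tensors
        return grad_output * out

# gradcheck wants double precision inputs with requires_grad=True
x = torch.randn(4, dtype=torch.double, requires_grad=True)
assert gradcheck(MyExp.apply, (x,), eps=1e-6, atol=1e-4)
```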
Thanks, I will look into using gradcheck.