How can I modify a ReLU layer's backward?

Thanks for the reply. From this post, however, it seems that what the hook has to return is a modified `grad_input`, whereas in the snippet in your post we seem to be returning a modification of `grad_out`.
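
To make the distinction concrete, here is a minimal sketch of what I mean, assuming `register_full_backward_hook` (available in recent PyTorch versions): the tuple the hook returns replaces `grad_input`, i.e. the gradient flowing *into* the ReLU, while `grad_output` is only read to inform the modification. The clamp here is just a placeholder modification, not anything from your snippet:

```python
import torch
import torch.nn as nn

relu = nn.ReLU()

def clamp_negative_grads(module, grad_input, grad_output):
    # The returned tuple replaces grad_input (the gradient w.r.t.
    # the module's input). grad_output is available read-only.
    # Placeholder modification: zero out negative gradients.
    return (grad_input[0].clamp(min=0),)

handle = relu.register_full_backward_hook(clamp_negative_grads)

x = torch.randn(4, requires_grad=True)
relu(x).sum().backward()
print(x.grad)  # reflects the hook's modified grad_input

handle.remove()  # detach the hook when done
```

Is that consistent with what your snippet is doing, or is returning a changed `grad_out` also supposed to work?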