Changing the gradients of tensors during backpropagation


I registered a hook on the output of each ReLU function in VGG19 to record gradients during the backward pass, using the following snippet:

x = module(x)
x = x.requires_grad_(True)
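For context, a minimal sketch of that recording loop might look like the following. The small `nn.Sequential` model here is only a stand-in for VGG19's `features`, and the `grads` list is an assumed name, not the original code:

```python
import torch
import torch.nn as nn

# Stand-in for VGG19's feature extractor (assumption for this sketch)
model = nn.Sequential(nn.Linear(8, 8), nn.ReLU(), nn.Linear(8, 4), nn.ReLU())
grads = []  # hypothetical container for the recorded gradients

x = torch.randn(1, 8)
for module in model:
    x = module(x)
    if isinstance(module, nn.ReLU):
        # The hook fires during backward with the gradient w.r.t. this output
        x.register_hook(lambda g: grads.append(g))

x.sum().backward()
print(len(grads))  # one recorded gradient per ReLU
```

Note that when the model's parameters require gradients, the intermediate outputs are already part of the autograd graph, so `register_hook` works on them directly.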

I need to apply the ReLU function to the gradients during backpropagation, so that each lower tensor receives only positive gradients from the tensor above it.
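One way to sketch this (a minimal example, not the actual VGG19 setup): a hook registered with `Tensor.register_hook` can return a new tensor, which then replaces the gradient that flows to the lower layers, so clamping the incoming gradient to be non-negative gives the desired behaviour:

```python
import torch

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = x * 2.0  # pretend this is a ReLU output in the network

# Returning a modified tensor from the hook replaces the gradient
# that continues backward; clamp(min=0) is exactly ReLU on the gradient.
y.register_hook(lambda grad: grad.clamp(min=0))

loss = (y * torch.tensor([-1.0, 1.0, 1.0])).sum()
loss.backward()
print(x.grad)  # tensor([0., 2., 2.]) — the negative component was zeroed
```

Without the hook, `x.grad` would be `tensor([-2., 2., 2.])`; the clamp zeroes the negative upstream gradient before it reaches `x`, which is the "only positive gradients" rule described above.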

I would really appreciate it if you could guide me on how to do this.
Thanks in advance.