Changing the gradient values of tensors during backpropagation

Hi

I registered a hook on the output of each ReLU function in VGG19 to record gradients during the backward pass, using the following code snippet:

# run the input through the module (a ReLU layer of VGG19) on the target device
x = module(xx.to(self.device))
# make sure the output requires grad so a hook can be attached to it
x = x.requires_grad_(True)
# record the gradient of this output during the backward pass
x.register_hook(self.save_gradient)

Now I need to apply a ReLU to the gradients during backpropagation, so that each lower tensor receives only the positive gradients from the tensor above it.
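From the documentation it looks like a hook registered with register_hook can return a new tensor that replaces the gradient, so I was thinking of a rough sketch like the one below (relu_grad is just a name I made up, and I'm not sure this is the right approach):

import torch

def relu_grad(grad):
    # keep only the positive part of the gradient before it flows to lower layers
    return grad.clamp(min=0)

x = module(xx.to(self.device))
x.register_hook(relu_grad)  # the tensor returned by the hook replaces the original gradient

Is this the correct way to do it, or is there a better way to modify gradients during the backward pass?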

I would really appreciate it if you could guide me on how to do this.
Thanks in advance.