How to modify the computed gradient of intermediate variables and then backpropagate the new gradients

While training a model, I want to change a non-leaf node's gradient and see how this affects the model's behavior. For example:
The gradient backpropagation chain is z -> y -> x.
I want to get y.grad, set it to new values, and then continue backpropagating to x,
so that x.grad is computed from the new values.
How can I do that in code?

You can register a backward hook on the intermediate tensor. See: torch.Tensor.register_hook — PyTorch 1.13 documentation. The hook receives the gradient flowing into that tensor and, if it returns a new tensor, that tensor is used in place of the original gradient for the rest of the backward pass.
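A minimal sketch of the idea (the values here are just illustrative): the hook replaces the gradient arriving at the non-leaf tensor `y`, and `x.grad` is then computed from the replaced gradient.

```python
import torch

x = torch.tensor([1.0, 2.0], requires_grad=True)
y = x * 3            # non-leaf (intermediate) tensor
z = (y ** 2).sum()

# The hook is called with the gradient w.r.t. y (here dz/dy = 2*y = [6., 12.]).
# Returning a new tensor makes autograd use it instead when continuing to x.
y.register_hook(lambda grad: torch.ones_like(grad))

z.backward()
# dy/dx = 3, and the replaced gradient is [1., 1.], so x.grad = [3., 3.]
print(x.grad)
```

If you only want to inspect (not replace) the gradient, have the hook return nothing; you can also combine this with `y.retain_grad()` if you additionally want `y.grad` populated.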