How to change the gradients of some parameters?

I’m trying to change the gradients of some parameters in such a way that the modified gradients still flow backward through the rest of the graph.

For instance, suppose there are four variables (A, B, C, D) with the corresponding computational graph:
A -> B -> C -> D

When I replace the gradient of C (setting dD/dC to a tensor from outside the graph, not modifying it in place), the gradients of A and B stay unchanged even though the gradient of C has changed.
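
Here is a minimal sketch of what I mean; the variable names and the actual operations (`* 2`, `** 2`, `sum`) are just placeholders for my real graph:

```python
import torch

# Toy graph A -> B -> C -> D (operations are placeholders for the real ones)
A = torch.randn(3, requires_grad=True)
B = A * 2
C = B ** 2
D = C.sum()

# Attempt: assign a gradient to C directly before calling backward
C.grad = torch.ones_like(C)

D.backward()
print(A.grad)  # still computed from the original dD/dC; my assignment is ignored
```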

The in-place approach via register_hook doesn’t seem to help here either, because I want to replace the gradient with completely different values; that is, dD/dC should be overwritten with a tensor coming from another computation graph.
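
For reference, this is roughly what I tried with a hook; `external` and `replace_grad` are made-up names, and I’m not sure whether returning a tensor from an unrelated graph is actually supported:

```python
import torch

# Same toy graph as above
A = torch.randn(3, requires_grad=True)
B = A * 2
C = B ** 2
D = C.sum()

# A tensor produced by some other computation, same shape as C
external = torch.randn(3)

def replace_grad(grad):
    # What I want: dD/dC becomes `external` instead of the autograd value.
    # Is returning a foreign tensor here actually allowed?
    return external

C.register_hook(replace_grad)
D.backward()
print(A.grad)  # I want this to be computed from `external`
```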

In short:

- C.grad = tensor: doesn’t apply to backward
- C.register_hook(): as far as I can tell, only useful for in-place modification

Is there any way to handle this case?
