Hello, I want to implement a kind of custom backpropagation with two separate backward chains, but I don’t know how to do it with autograd.

Say I have a set of operations as:

```
y = f1(x)
u = f2(y)
v = f3(y)
z = f4(u, v)
```

Then the backward chain should be:

```
dz/dx = dz/dy * dy/dx
      = (dz/du * du/dy + dz/dv * dv/dy) * dy/dx    (1)
```

Equation (1) above can be further split into two chains:

```
dz/dx = (dz/du * du/dy * dy/dx) + (dz/dv * dv/dy * dy/dx)    (2)
```
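To convince myself that (1) and (2) agree, here is a small pure-Python check. The concrete functions (f1 = x², f2 = sin, f3 = cos, f4 = u·v) are arbitrary choices of mine, just to make the chains computable by hand:

```python
import math

x = 1.5

# Forward pass with concrete example functions (arbitrary choices):
y = x ** 2          # f1
u = math.sin(y)     # f2
v = math.cos(y)     # f3
z = u * v           # f4

# Hand-written local derivatives:
dz_du = v
dz_dv = u
du_dy = math.cos(y)
dv_dy = -math.sin(y)
dy_dx = 2 * x

# Equation (1): merge the branches at y, then chain through dy/dx once.
grad_eq1 = (dz_du * du_dy + dz_dv * dv_dy) * dy_dx

# Equation (2): two fully separate chains, summed at the end.
grad_eq2 = (dz_du * du_dy * dy_dx) + (dz_dv * dv_dy * dy_dx)

print(grad_eq1, grad_eq2)
```

Both expressions give the same number, as expected from distributing the multiplication by dy/dx.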

However, I want to modify one branch of the backpropagation chain in equation (2), for example by scaling the first branch by its mean:

```
dz/dx = (dz/du * du/dy * dy/dx) * mean(dz/du * du/dy * dy/dx) + (dz/dv * dv/dy * dy/dx)
```

My understanding is that autograd computes dz/dx according to equation (1), not equation (2), so the two backward chains never exist separately inside autograd's computation. How can I implement this gradient modification in autograd?
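For reference, the direction I have been experimenting with (a sketch, assuming PyTorch; f1–f4 below are arbitrary placeholders of mine, not part of my real model): `torch.autograd.grad` with `grad_outputs` seems to let me materialize each chain of equation (2) separately and then recombine them by hand:

```python
import torch

x = torch.tensor([0.5, 1.5], requires_grad=True)

# Arbitrary placeholder choices for f1..f4:
y = x ** 2          # f1
u = torch.sin(y)    # f2
v = torch.cos(y)    # f3
z = (u * v).sum()   # f4, reduced to a scalar loss

# Chain 1: dz/du, then pushed back through u -> y -> x.
dz_du = torch.autograd.grad(z, u, retain_graph=True)[0]
branch_u = torch.autograd.grad(u, x, grad_outputs=dz_du, retain_graph=True)[0]

# Chain 2: dz/dv, pushed back through v -> y -> x.
dz_dv = torch.autograd.grad(z, v, retain_graph=True)[0]
branch_v = torch.autograd.grad(v, x, grad_outputs=dz_dv, retain_graph=True)[0]

# Sanity check: the two chains should sum to the ordinary gradient.
full_grad = torch.autograd.grad(z, x, retain_graph=True)[0]

# The modification I want: scale one branch by its mean before summing.
modified_grad = branch_u * branch_u.mean() + branch_v
```

I am not sure this is idiomatic, though, since it runs the backward pass through y twice, so pointers to a hook-based or custom `autograd.Function` approach would be welcome.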