Use grad() graph assuming independence between inputs to compute an expression

Assume I have 3 neural networks (linear layers, for example):

import torch as th
import torch.nn as nn

nn1 = nn.Linear(1, 1)      # takes t, generates x
nn2 = nn.Linear(1+1, 1)    # takes t and x, generates y
nn3 = nn.Linear(1+1+1, 1)  # takes t, x, and y, generates z

And I have some unknown expression h that takes all outputs (x, y, and z) and yields some value.

t = th.rand(10, 1)
x = nn1(t)
y = nn2(th.cat([t, x], dim=1))
z = nn3(th.cat([t, x, y], dim=1))

h = 3*x*y + z*y + 4 # Assume h is not known

Can I compute a function/expression/graph/etc. for dh/dy (for example) that treats the variables x, y, and z as independent, but that I can still use for back-propagation while acknowledging their dependence?

Another perspective:
In the example above, can I use grad to compute hy = 3*x + z (assuming I don't initially know h) from the information given above?
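
To make concrete what I mean by treating them as independent, here is a rough sketch of the quantity I'm after, written as if I could re-evaluate the unknown h on detached copies of x, y, and z (h_fn below is a hypothetical black-box callable standing in for h). The problem is that detaching cuts the graph back to t, which is exactly the dependence I want to keep for back-propagation:

x_leaf = x.detach().requires_grad_(True)  # detached copies are leaves, so x, y, z look independent
y_leaf = y.detach().requires_grad_(True)
z_leaf = z.detach().requires_grad_(True)

h_leaf = h_fn(x_leaf, y_leaf, z_leaf)  # h_fn: hypothetical black box evaluating the unknown h

# Partial derivative of h w.r.t. y with x and z held fixed;
# for h = 3*x*y + z*y + 4 this is 3*x + z, elementwise.
hy = th.autograd.grad(h_leaf, y_leaf, grad_outputs=th.ones_like(h_leaf))[0]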

I'm not sure, but there could be two options.

The first one is simply:
h.backward(), and then y.grad = ∂h/∂y,
provided y is declared as a leaf tensor like t.
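
In this example y is the output of nn2, not a leaf, so my guess at the closest runnable version uses retain_grad(); a minimal sketch:

y.retain_grad()              # ask autograd to keep the gradient on the non-leaf tensor y
h.backward(th.ones_like(h))  # h is (10, 1), so pass ones as the upstream gradient
print(y.grad)                # gradient of h w.r.t. y, accumulated through every path to y
                             # (including the one through z = nn3(th.cat([t, x, y], dim=1)))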

The second one is:
h.backward() gives t.grad = ∂h/∂t,
zero the gradients,
y.backward() gives a new t.grad = ∂y/∂t,

and then ∂h/∂y = ∂h/∂t * ∂t/∂y = (first t.grad) / (second t.grad).
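
A rough sketch of that, assuming t is created with requires_grad=True so that t.grad exists, and passing ones as the upstream gradient since h and y are (10, 1); I haven't checked whether the elementwise ratio really equals ∂h/∂y here:

t = th.rand(10, 1, requires_grad=True)  # t must be a leaf with requires_grad
x = nn1(t)
y = nn2(th.cat([t, x], dim=1))
z = nn3(th.cat([t, x, y], dim=1))
h = 3*x*y + z*y + 4

h.backward(th.ones_like(h), retain_graph=True)  # t.grad is now dh/dt (total derivative)
dh_dt = t.grad.clone()

t.grad.zero_()                                  # zero the gradient before the second pass
y.backward(th.ones_like(y))                     # t.grad is now dy/dt (total derivative)
dy_dt = t.grad.clone()

hy_estimate = dh_dt / dy_dt                     # elementwise ratio (dh/dt) / (dy/dt)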

How about using these?