Now assume I have two modules, denoted as functions $f$ and $g$, where $f$ takes input $x$ and parameters $\theta_f$, and $g$ takes parameters $\theta_g$. The loss is $L = g(z, \theta_g)$ with $z = f(x, \theta_f)$.
When backpropagation starts, we have $\frac{\partial L}{\partial \theta_g}$ and $\frac{\partial L}{\partial z}$. The latter is then used to calculate $\frac{\partial L}{\partial \theta_f}$.
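Written out, the chain-rule step I mean is:

$$\frac{\partial L}{\partial \theta_f} = \frac{\partial L}{\partial z} \cdot \frac{\partial z}{\partial \theta_f}$$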
When we call backward(), autograd does the above operation and gives us $\frac{\partial L}{\partial \theta_g}$ and $\frac{\partial L}{\partial \theta_f}$, but is there a way for me to also get $\frac{\partial L}{\partial z}$?
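To make the question concrete, here is a minimal sketch of what I mean. The two Linear layers and the shapes are just placeholders for $f$ and $g$; the retain_grad() call is one approach I came across, but I'm not sure it's the intended way:

```python
import torch
import torch.nn as nn

# Placeholder modules standing in for f and g (shapes are arbitrary).
f = nn.Linear(4, 3)   # parameters theta_f
g = nn.Linear(3, 1)   # parameters theta_g

x = torch.randn(2, 4)
z = f(x)              # z = f(x, theta_f)
z.retain_grad()       # ask autograd to keep the gradient on the non-leaf tensor z
L = g(z).sum()        # L = g(z, theta_g), reduced to a scalar so backward() works

L.backward()

print(f.weight.grad)  # dL/d(theta_f)
print(g.weight.grad)  # dL/d(theta_g)
print(z.grad)         # dL/dz -- the quantity I'd like to read out
```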
Btw, the display of the equations seems to have gone wrong.