About the gradient of intermediate variables w.r.t. input

Is there a good way to compute the following? Suppose x is a tensor that requires gradient, and I have two functions f(x) and g(x), with h(x) = g(x) / f(x). When I call h.backward() to get the gradient \partial h / \partial x, how can I also efficiently extract \partial f / \partial x, since f is already part of the computation graph?
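For concreteness, here is a minimal sketch of the setup, using hypothetical choices f(x) = x^2 and g(x) = x^3. One approach I have seen is calling torch.autograd.grad(f, x, retain_graph=True) before h.backward(), so the graph built for f is reused rather than freed:

```python
import torch

# Hypothetical example functions, just for illustration.
x = torch.tensor(3.0, requires_grad=True)
f = x ** 2          # f(x) = x^2
g = x ** 3          # g(x) = x^3
h = g / f           # h(x) = x, so dh/dx = 1

# Extract df/dx; retain_graph=True keeps the graph alive
# so that h.backward() can still run afterwards.
(df_dx,) = torch.autograd.grad(f, x, retain_graph=True)

h.backward()        # populates x.grad with dh/dx

print(df_dx)        # df/dx = 2x = 6
print(x.grad)       # dh/dx = 1
```

Is this the idiomatic way, or does the extra grad call repeat work that backward() would do anyway?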

Help would be much appreciated!