Computing Jacobian/Hessian Vector Product

I’m working on a research problem where I have a GAN, and I need to compute the following: let $A_{ij}=\frac{\partial^2 L}{\partial\theta^G_i\,\partial\theta^D_j}$, where $L$ is the loss function and $\theta^G$, $\theta^D$ are the generator and discriminator parameters. I need to compute the matrix-vector product $Av$ for a given vector $v$. How can I do that?
One thing I tried is computing the Hessian-vector product with autograd.functional.hvp on an auxiliary function that I feed (D.parameters(), G.parameters()) and that returns the loss. However, I get all zeros, because the auxiliary function doesn’t actually depend on its inputs (the loss is built from the modules’ own parameters, not from the tensors passed in). How can I fix this?
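Here is a minimal reproduction of what I tried. The toy G, D, and the input z are stand-ins for my actual models; the auxiliary function ignores the flattened parameter vectors it receives, which is exactly the bug I’m describing:

```python
import torch
from torch import nn
from torch.nn.utils import parameters_to_vector
from torch.autograd.functional import hvp

# Hypothetical toy stand-ins for the real GAN
G = nn.Linear(2, 2)
D = nn.Linear(2, 1)
z = torch.randn(4, 2)

def aux(theta_d, theta_g):
    # BUG: the loss is computed from G's and D's own parameters,
    # not from the theta_d/theta_g tensors passed in, so the output
    # is independent of the inputs and all derivatives come out zero.
    return D(G(z)).mean()

td = parameters_to_vector(D.parameters()).detach()
tg = parameters_to_vector(G.parameters()).detach()
v = (torch.randn_like(td), torch.randn_like(tg))

_, (hd, hg) = hvp(aux, (td, tg), v)
print(hd.abs().max(), hg.abs().max())  # both are zero
```

With strict=False (the default), hvp silently returns zeros for inputs the output does not depend on, which matches what I’m seeing.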
Thanks in advance!