# Hessian computation in GAN training

Hi everyone, I hope someone can help me with this one.
I am trying to implement an algorithm for GAN training. I have defined my generator, discriminator, and loss, say:
```python
G = mygenerator()
D = mydiscriminator()
loss = myloss()
```
Now, going through one step of my training routine: once I have computed the loss (which is a function of both G and D), I need to compute its gradient with respect to the parameters of both networks. I will refer to the parameters of G as X and to those of D as Y. I do this as:

```python
grad_x = autograd.grad(loss, G.parameters(), create_graph=True, retain_graph=True, allow_unused=True)
grad_y = autograd.grad(loss, D.parameters(), create_graph=True, retain_graph=True, allow_unused=True)
grad_y_vec = torch.cat([g.reshape(-1) for g in grad_y])
hvp = autograd.grad(grad_y_vec, G.parameters(), grad_outputs=grad_y_vec)
```
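To make the question concrete, here is a minimal, self-contained sketch of what I mean, with toy stand-in networks and a made-up loss (the `nn.Linear` modules and the loss below are just placeholders to check shapes, not my real models). The second `autograd.grad` call computes the mixed Hessian-vector product `D²_xy(loss) · grad_y` without ever building the full Hessian:

```python
import torch
from torch import autograd, nn

torch.manual_seed(0)

# Toy stand-ins for the real networks (placeholders, any nn.Module works)
G = nn.Linear(3, 4)      # "generator" with parameters X
D = nn.Linear(4, 1)      # "discriminator" with parameters Y

z = torch.randn(5, 3)
loss = D(G(z)).mean()    # made-up loss depending on both X and Y

# First-order gradients; create_graph=True keeps the graph so we can
# differentiate the gradients themselves a second time
grad_x = autograd.grad(loss, G.parameters(), create_graph=True,
                       retain_graph=True, allow_unused=True)
grad_y = autograd.grad(loss, D.parameters(), create_graph=True,
                       retain_graph=True, allow_unused=True)
grad_y_vec = torch.cat([g.reshape(-1) for g in grad_y])

# Hessian-vector product: differentiate grad_y w.r.t. X, contracted with
# grad_y itself via grad_outputs (vector-Jacobian product)
hvp = autograd.grad(grad_y_vec, G.parameters(),
                    grad_outputs=grad_y_vec, retain_graph=True)

for p, h in zip(G.parameters(), hvp):
    assert h.shape == p.shape  # one HVP entry per generator parameter
```

The key point is that `grad_outputs=grad_y_vec` contracts the Jacobian of `grad_y_vec` w.r.t. X with the vector `grad_y_vec`, so memory stays linear in the number of parameters.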