How to get higher-order gradients w.r.t. inputs?

I am trying to compute the Hessian matrix of the loss w.r.t. the input. It is straightforward to get the 1st derivative of the loss w.r.t. the input, but is there a way to compute 2nd-order gradients of the loss w.r.t. the input in PyTorch?

You can compute Hessian-vector products. The trick is to pass create_graph=True to torch.autograd.grad (or x.backward). The returned gradients are then part of the graph themselves, so you can use them in further computation, and calling torch.autograd.grad (or .backward) on that result will do the right thing.
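
For example, here is a minimal sketch with a made-up quadratic loss (your model and loss will of course differ) that computes a Hessian-vector product and, for a small input, the full Hessian row by row:

```python
import torch

# Hypothetical setup: a scalar loss that depends on an input x.
x = torch.randn(3, requires_grad=True)
W = torch.randn(3, 3)
loss = x @ W @ x  # some scalar function of the input

# First-order gradient; create_graph=True keeps the graph so we can
# differentiate the gradient itself.
grad_x, = torch.autograd.grad(loss, x, create_graph=True)

# Hessian-vector product: differentiate (grad_x . v) w.r.t. x.
v = torch.randn(3)
hvp, = torch.autograd.grad(grad_x, x, grad_outputs=v, retain_graph=True)

# Full Hessian, one row at a time (only practical for small inputs).
hessian = torch.stack([
    torch.autograd.grad(grad_x[i], x, retain_graph=True)[0]
    for i in range(x.numel())
])
print(hvp)
print(hessian)
```

Building the full Hessian this way costs one backward pass per input element, so for large inputs you usually stick with Hessian-vector products.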

Best regards

Thomas