Getting the Hessian matrix (or higher-order derivatives)

Hello. I’m using PyTorch built from the latest master branch. It seems this version is able to compute higher-order derivatives using create_graph=True:

create_graph (bool, optional) – If true, graph of the derivative will be constructed, allowing to compute higher order derivative products. (from the docs)

Though I’ve read some discussions, I’m still not sure how to get the Hessian matrix (or even higher-order derivatives) using this.

Thank you in advance.


At least, this naive attempt does not work:

import torch
from torch.autograd import Variable

a = Variable(torch.Tensor([1, 2]), requires_grad=True)
b = a[0] ** 2 + a[1] ** 2
b.backward(create_graph=True)
a.grad.grad  # this does not work; it does not give the second derivative
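
What does seem to work instead is to differentiate the gradient again rather than reading a.grad.grad. A minimal sketch, assuming torch.autograd.grad is available on your master build:

import torch
from torch.autograd import Variable, grad

a = Variable(torch.Tensor([1, 2]), requires_grad=True)
b = a[0] ** 2 + a[1] ** 2

# db/da = [2*a0, 2*a1]; create_graph=True keeps it differentiable
(da,) = grad(b, a, create_graph=True)

# differentiate each component of the gradient again -> rows of the Hessian
(row0,) = grad(da[0], a, retain_graph=True)  # [2, 0]
(row1,) = grad(da[1], a)                     # [0, 2]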

Maybe https://github.com/pytorch/pytorch/blob/master/test/test_autograd.py#L155 will help you.
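
The rough idea in those tests is that, after a backward pass with create_graph=True, the .grad Variables are themselves part of the graph and can be differentiated again. A tiny sketch of just that step (my own example, not the test code; it assumes a master build with higher-order autograd support):

import torch
from torch.autograd import Variable

x = Variable(torch.randn(2, 2), requires_grad=True)
z = (x ** 3).sum()
z.backward(create_graph=True)

print(x.grad)                # 3 * x**2
print(x.grad.requires_grad)  # True here, so x.grad can be differentiated again

The Hessian-vector tests build on that: they call backward a second time through an expression made from the .grad variables.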

Cheers
Nabarun

Thank you.

The code in that test is (abridged):

x = Variable(torch.randn(2, 2), requires_grad=True)
y = Variable(torch.randn(2, 2), requires_grad=True)
z = x ** 2 + y * x + y ** 2
...
grad_sum = 2 * x.grad + y.grad

I think grad_sum is ∂z/∂x, but is there no way to get the Hessian directly? Thank you.

Sorry, I am confused. Is there no API that gives the Hessian directly?

I don’t think there is a way to get the full Hessian directly. Instead, you can use a Hessian-vector product.
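
For reference, here is a minimal sketch of that approach, assuming torch.autograd.grad on your build behaves like in recent releases: compute a Hessian-vector product with a double backward, and, if you really need the full matrix, assemble it from products with unit vectors. The helper names below are just illustrative, not a PyTorch API.

import torch
from torch.autograd import Variable, grad

def hessian_vector_product(f, x, v):
    # H(x) @ v for a scalar function f, via double backward
    out = f(x)
    (g,) = grad(out, x, create_graph=True)  # first derivatives, kept in the graph
    (hv,) = grad((g * v).sum(), x)          # d/dx <g, v> = H @ v
    return hv

def full_hessian(f, x):
    # assemble the Hessian column by column from HVPs with unit vectors
    # (recomputes f once per column; fine for small examples)
    n = x.numel()
    cols = [hessian_vector_product(f, x, Variable(torch.eye(n)[i])).data
            for i in range(n)]
    return torch.stack(cols, dim=1)

x = Variable(torch.Tensor([1.0, 2.0]), requires_grad=True)
f = lambda t: (t[0] ** 2) * t[1] + t[1] ** 3
print(full_hessian(f, x))  # expected [[4, 2], [2, 12]] at x = (1, 2)

Assembling the full Hessian this way costs one extra backward pass per input dimension, which is presumably why only the vector product is exposed as a building block.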