Calculating Hessian Vector Product

On the release page for 0.2, there is mention of the ability to compute higher-order derivatives, including the Hessian-vector product. Has anyone tried to implement a Hessian-vector product calculation? I looked for examples but couldn't find one for the latest release.


You can use the autograd.grad() function to compute derivatives of any order. The v0.2.0 release notes show how to use it in the section on higher-order gradients.
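For instance, here is a minimal second-derivative sketch (using the modern tensor API rather than the old Variable wrapper, so it assumes PyTorch >= 0.4):

```python
import torch

# Second derivative of f(x) = x**3 at x = 2 by calling autograd.grad twice.
x = torch.tensor(2.0, requires_grad=True)
f = x ** 3

# create_graph=True records the graph of the first derivative so that it
# can itself be differentiated.
(df_dx,) = torch.autograd.grad(f, x, create_graph=True)  # 3 * x**2 -> 12
(d2f_dx2,) = torch.autograd.grad(df_dx, x)               # 6 * x    -> 12
```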

Thanks! The v0.2.0 release notes are not very clear about this. Could you or someone provide some example code showing how to use autograd.grad() to calculate a Hessian-vector product with respect to some arbitrary vector v?

Hi, suppose you want to calculate the Hessian-vector product (∂²f/∂x²)v. Then:

>>> import torch
>>> from torch import autograd
>>> from torch.autograd import Variable
>>> v = Variable(torch.Tensor([1, 1]))
>>> x = Variable(torch.Tensor([0.1, 0.1]), requires_grad=True)
>>> f = 3 * x[0] ** 2 + 4 * x[0] * x[1] + x[1] ** 2
>>> grad_f, = autograd.grad(f, x, create_graph=True)
>>> z = grad_f @ v
>>> z.backward()
>>> x.grad
Variable containing:
 10
  6
[torch.FloatTensor of size 2]

This works because (∂²f/∂x²)v = (∂/∂x)[(∂f/∂x)·v]: the dot product (∂f/∂x)·v is a scalar, so differentiating it with respect to x yields the Hessian-vector product without ever forming the full Hessian. Here the Hessian is [[6, 4], [4, 2]], so with v = [1, 1] the result is [10, 6].