Simultaneous evaluation of gradient and Hessian

Hi, I have a simple question: does anyone know a straightforward method to compute both the gradient and Hessian of a scalar function simultaneously using autograd? For example, when you use a function like torch.autograd.functional.hessian, the gradient is computed as part of the process - how do I get access to it?
In the standalone autograd library (outside of PyTorch) I had to edit the source code in order to do this; I was hoping I wouldn't have to do the same here!
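
For illustration, here is a minimal sketch of the two-pass approach this question is trying to avoid (the function and shapes here are just an assumed example): the Hessian call builds the gradient internally but does not expose it, so the gradient has to be recomputed separately.

import torch
from torch.autograd.functional import hessian

def f(x):
    return (x ** 2).sum()  # simple scalar-valued function

x = torch.randn(3, requires_grad=True)

# Hessian via the functional API; the gradient is built internally but not returned
H = hessian(f, x)

# ...so the gradient ends up being recomputed in a separate pass
(g,) = torch.autograd.grad(f(x), x)
print(g, H)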

Thank you to anyone who replies, Sean.

Hi @FOXP20,

You can use the torch.func package for this. Simply define a function that computes the first derivative and returns a copy of it, then differentiate that function again, returning the copy of the first derivative as an auxiliary output (via has_aux=True).

Here’s a minimal reproducible example:

import torch
from torch.func import jacrev  # reverse-mode AD

def func(x):
    return x**2

def jacobian_func(x):
    # First derivative of func; return a second copy to pass through as the aux output
    jac = jacrev(func, argnums=0)(x)
    return jac, jac

def hessian_with_grad_func(x):
    # Differentiate jacobian_func again; has_aux=True hands back the untouched copy of the Jacobian
    hess, jac = jacrev(jacobian_func, argnums=0, has_aux=True)(x)
    return hess, jac

x = torch.randn(1)  # for a batch of samples, wrap the call in torch.func.vmap

hess, jac = hessian_with_grad_func(x)
print("input: ", x)
print("jacobian: ", jac)  # equals 2*x
print("hessian: ", hess)  # equals 2

Fantastic, that was exactly the type of response I was looking for - thank you!