How to take the gradient with respect to activation values using torch.autograd.grad?

Hi,

I am new to PyTorch and to register_hook, and I am wondering how I can get the gradient with respect to intermediate activation values when I compute a Hessian-vector product with autograd:

```python
grad_grad = torch.autograd.grad(grad_vec, self.model.parameters(), grad_outputs=vec, retain_graph=True)
```

where grad_vec is the gradient and vec is the vector I want to multiply by the Hessian.
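
To make the question concrete, here is a minimal sketch of the kind of thing I am trying to do (the two-layer model, the forward hook, and the loss below are just placeholders, not my real code): capture the intermediate activation with a forward hook, take the first gradient with create_graph=True, and then call torch.autograd.grad again with that activation as the inputs argument.

```python
import torch
import torch.nn as nn

# Placeholder model and loss, just to illustrate the question.
model = nn.Sequential(nn.Linear(10, 20), nn.Tanh(), nn.Linear(20, 1))

activations = {}

def save_activation(name):
    def hook(module, inputs, output):
        # Keep the output tensor itself so it stays connected to the graph.
        activations[name] = output
    return hook

model[0].register_forward_hook(save_activation("fc1"))

x = torch.randn(4, 10)
loss = model(x).sum()
act = activations["fc1"]

# First-order gradient of the loss w.r.t. the activation; create_graph=True
# keeps the graph so it can be differentiated a second time.
grad_act, = torch.autograd.grad(loss, act, create_graph=True)

# Hessian-vector product w.r.t. the activation: pass vec as grad_outputs,
# the same way as in my parameter version above.
vec = torch.randn_like(grad_act)
hvp_act, = torch.autograd.grad(grad_act, act, grad_outputs=vec, retain_graph=True)

print(hvp_act.shape)  # same shape as the activation, e.g. torch.Size([4, 20])
```

Is something like this the right approach, or should I be using register_hook on the activation tensor instead?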