Computing the Hessian

import torch
from torch.autograd import Variable

x = torch.ones(1)
x = Variable(x, requires_grad=True)
y = x * x
y.backward()
torch.autograd.grad(x.grad, x)

Why does this method of computing the Hessian not work?

RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn

Hi Mayank,
That code fails for the reason the error message gives: y.backward() was called without create_graph=True, so the graph of the backward pass is never built. x.grad ends up as a plain tensor with no grad_fn, and torch.autograd.grad has nothing to differentiate through.
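
As an aside, your original approach can be made to work: passing create_graph=True to backward() builds the graph of the backward pass itself, so x.grad carries a grad_fn and can be differentiated. A minimal sketch of that variant:

import torch

x = torch.ones(1, requires_grad=True)
y = x * x
y.backward(create_graph=True)  # also build the graph of the backward pass
torch.autograd.grad(x.grad, x)  # returns (tensor([2.]),), i.e. d2y/dx2 = 2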

As I understand it, you want to calculate the second derivative of a function y with respect to the input x. Here's how it could be done (note that Variable is deprecated; set requires_grad directly on the tensor):

import torch

x = torch.ones(1, requires_grad=True)
y = x * x

# First derivative. create_graph=True builds the graph of the derivative
# itself, which is required to compute higher-order derivatives.
first_grad = torch.autograd.grad(y, x, create_graph=True)[0]

second_grad = torch.autograd.grad(first_grad, x)[0]  # this is the second derivative you need
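
For a vector input, the same pattern extends to the full Hessian by differentiating each component of the gradient in turn. A minimal sketch, assuming y is a scalar function of a vector x (the cubic below is just an illustrative choice):

import torch

x = torch.randn(3, requires_grad=True)
y = (x ** 3).sum()

# Gradient computed with create_graph=True so it can itself be differentiated.
grad = torch.autograd.grad(y, x, create_graph=True)[0]

# Row i of the Hessian is the derivative of grad[i] w.r.t. x.
# retain_graph=True keeps the graph alive across the loop iterations.
hessian = torch.stack([
    torch.autograd.grad(grad[i], x, retain_graph=True)[0]
    for i in range(x.numel())
])

Recent PyTorch versions also provide torch.autograd.functional.hessian, which wraps this up for you.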