Here is the code I have implemented:
import torch

def calc_grad_hessian(func, x):
    x = torch.tensor(x, dtype=torch.float, requires_grad=True)
    y = func(x)
    # first-order gradient; create_graph=True so it can be differentiated again
    grad = torch.autograd.grad(y, x, create_graph=True)[0]
    n = x.shape[0]
    hessian = torch.zeros(n, n)
    for i in range(n):
        # derivatives of the i-th gradient component w.r.t. x
        hessian[:, i] = torch.autograd.grad(grad[i], x, retain_graph=True)[0]
    return y.detach().numpy(), grad.detach().numpy(), hessian.detach().numpy()
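For reference, this is roughly how I intend to call it (a made-up test function, just for illustration; the Hessian of sum(x_i^2) should be 2*I):

    y, grad, hess = calc_grad_hessian(lambda t: (t ** 2).sum(), [1.0, 2.0, 3.0])
    print(y, grad, hess)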
Even though I have declared requires_grad=True on x, it always raises an error at the line
grad = torch.autograd.grad(y, x, create_graph=True)[0]:
RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn
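For what it's worth, here is a minimal sketch of a func that produces exactly this error (only a guess at the kind of thing my real func might be doing, namely leaving the autograd graph by going through NumPy):

    import numpy as np
    import torch

    def bad_func(x):
        # going through NumPy detaches the result from the graph,
        # so the returned y has no grad_fn
        return torch.tensor(np.sum(x.detach().numpy() ** 2))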
Can anyone tell me why?
Thank you in advance.