Hi,
I'm using the following code to generate the Hessian of my loss function with respect to the variable x. The problem is that the generated Hessian is not symmetric, although it must be. Could anybody tell me whether this might be numerical instability in torch.autograd?
P.S.: my model is an official PyTorch ResNet model.
out = model(x)
optimizer.zero_grad()
loss = -criterion_nr(out.squeeze(), torch.sigmoid(out.squeeze()))
loss = torch.mean(loss, dim=[1, 2])
first_drv = torch.zeros(batch_size, x_dim).to(device)
hessian = torch.zeros(batch_size, x_dim, x_dim).to(device)
for n in range(batch_size):
    # first derivative of sample n's loss w.r.t. x
    first_drv[n] = torch.autograd.grad(loss[n], x,
                                       create_graph=True, retain_graph=True)[0][n]
    for i in range(x_dim):
        # row i of sample n's Hessian
        hessian[n][i] = torch.autograd.grad(first_drv[n][i], x,
                                            create_graph=True, retain_graph=True)[0][n]
print(hessian[0])
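As a point of comparison, here is a small sanity check (my own sketch with a made-up toy scalar function, not my actual ResNet loss) using `torch.autograd.functional.hessian`: in float64 the mixed second derivatives agree to near machine precision, which makes me suspect float32 round-off rather than a bug in my loop:

```python
import torch

# Toy sanity check (hypothetical function, not the ResNet loss above):
# compare the Hessian's mixed partials in double precision.
def f(x):
    return (x[0] ** 2) * x[1] + torch.sin(x[0] * x[1])

x64 = torch.tensor([0.7, -1.3], dtype=torch.float64)
h64 = torch.autograd.functional.hessian(f, x64)

# Maximum asymmetry is near float64 machine epsilon.
print((h64 - h64.T).abs().max())
```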
> tensor([[ 857.2029,  196.4826],
>         [ 196.4857, 1563.9629]])
The off-diagonal entries 196.4826 and 196.4857 differ in the last digits, and this asymmetry makes the rest of my computation wrong.
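For the downstream computation, one common workaround (since the true Hessian is symmetric by construction) is to symmetrize the float32 result before using it, e.g. before an eigendecomposition. A minimal sketch, using the values from the printout above:

```python
import torch

# Values from the printout above; the off-diagonal mismatch (~3e-3) is
# consistent with float32 round-off accumulated through a deep network.
h = torch.tensor([[857.2029, 196.4826],
                  [196.4857, 1563.9629]])

# Enforce exact symmetry before downstream use.
h_sym = 0.5 * (h + h.T)
print(torch.equal(h_sym, h_sym.T))  # True
```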