Torch.autograd.functional.hessian

I am trying to calculate the trace of the Hessian of a sample neural network model.

I am using torch.autograd.functional.hessian, which returns a tuple of tuples of Tensors (one block for each pair of inputs passed to it).

For simplicity, let's assume we have a linear model with 2 inputs and 2 outputs. Below is the Hessian I get in this case for a random input and a chosen loss function.
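
The setup is roughly the following (a minimal sketch; the loss function, input, and target here are placeholder assumptions, only the shapes match my real case):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.autograd.functional import hessian

# Sketch: a linear model with 2 inputs and 2 outputs on a single random sample.
# The MSE loss is just an illustration; my real loss is different.
model = nn.Linear(2, 2).cuda()
x = torch.randn(2, device='cuda')
target = torch.randn(2, device='cuda')

def loss_fn(weight, bias):
    # Express the loss as a function of the parameters so that
    # hessian() differentiates with respect to them.
    out = F.linear(x, weight, bias)
    return F.mse_loss(out, target)

# create_graph=True keeps grad_fn on the result, as in the output below.
H = hessian(loss_fn, (model.weight, model.bias), create_graph=True)
```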

H =
((tensor([[[[ 0.1600,  0.1540],
            [-0.1600, -0.1540]],
  
           [[ 0.1540,  0.1482],
            [-0.1540, -0.1482]]],
  
  
          [[[-0.1600, -0.1540],
            [ 0.1600,  0.1540]],
  
           [[-0.1540, -0.1482],
            [ 0.1540,  0.1482]]]], device='cuda:0', grad_fn=<ViewBackward0>),
  tensor([[[ 0.1972, -0.1972],
           [ 0.1898, -0.1898]],
  
          [[-0.1972,  0.1972],
           [-0.1898,  0.1898]]], device='cuda:0', grad_fn=<ViewBackward0>)),
 (tensor([[[ 0.1972,  0.1898],
           [-0.1972, -0.1898]],
  
          [[-0.1972, -0.1898],
           [ 0.1972,  0.1898]]], device='cuda:0', grad_fn=<ViewBackward0>),
  tensor([[ 0.2431, -0.2431],
          [-0.2431,  0.2431]], device='cuda:0', grad_fn=<ViewBackward0>)))

How can I calculate the trace of the Hessian in this case? torch.trace didn't work for me. Any ideas or tips?
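
For context, this is roughly where I got stuck (a sketch; H refers to the output above):

```python
# torch.trace only accepts a single 2-D tensor, so it fails both on the
# nested tuple and on most of the individual blocks:
#
#   torch.trace(H)        # fails: H is a tuple of tuples, not a Tensor
#   torch.trace(H[0][0])  # fails: H[0][0] is 4-D, shape (2, 2, 2, 2)
#
# The parameter-wise diagonal blocks are H[0][0] (weight-weight) and
# H[1][1] (bias-bias); only H[1][1] is already a 2-D matrix.
print(H[0][0].shape)          # torch.Size([2, 2, 2, 2])
print(H[1][1].shape)          # torch.Size([2, 2])
print(torch.trace(H[1][1]))   # works, but only for the bias block
```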