Any way to compute the eigenvalues of the Hessian during optimizer.step()?

Basically the title. I'm currently writing a custom optimizer, and I'd like to build a Hessian matrix and compute its eigenvalues over all of the parameters inside the .step() method.

The main problem is that in optimizer.step() one has to iterate over every param in each param_group and check whether it has a gradient, which makes it awkward to piece together a matrix or compute eigenvalues for the param_group as a whole. Is there a way to compute the Hessian and its eigenvalues for a param_group without iterating over each parameter individually?

I was thinking of doing this by iterating anyway, storing all of the gradients for a group, and then building the Hessian from them (rough sketch below), but I suspect there is a more PyTorch-idiomatic way of doing the same thing.
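Roughly what I had in mind, inside the loop over self.param_groups (just an untested sketch; it assumes the backward pass was run with loss.backward(create_graph=True) so each .grad is still attached to the graph, and it builds the full dense Hessian for the group, so it would only be feasible for small groups):

    params = [p for p in param_group['params'] if p.grad is not None]
    # Flatten every parameter's gradient into one long vector for the whole group
    flat_grad = torch.cat([p.grad.reshape(-1) for p in params])

    # Differentiate each gradient entry again to get one row of the group Hessian
    rows = []
    for g in flat_grad:
        row = torch.autograd.grad(g, params, retain_graph=True, allow_unused=True)
        rows.append(torch.cat([
            torch.zeros_like(p).reshape(-1) if r is None else r.reshape(-1)
            for r, p in zip(row, params)]))

    hessian = torch.stack(rows)                   # (n, n) for n parameters in the group
    eigenvalues = torch.linalg.eigvalsh(hessian)  # Hessian is symmetric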

What I currently have in optimizer.step():

        # Iterate over parameter groups
        for param_group in self.param_groups:
            for param in param_group['params']:
                if param.grad is None:
                    continue
                gradients = param.grad
                # This doesn't really make sense, as the Hessian should be built
                # from the whole param_group and not from individual parameters.
                # Note: differentiating .grad again only works if backward was
                # called with create_graph=True, and with grad_outputs=ones this
                # is a Hessian-vector product rather than the full Hessian.
                self.state[param]["HESSIAN"] = torch.autograd.grad(
                    gradients, param, grad_outputs=torch.ones_like(gradients),
                    retain_graph=True)[0]
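In case it matters for answers: I don't necessarily need the full spectrum. For just the top eigenvalue, my understanding is that power iteration on Hessian-vector products would avoid materializing the matrix entirely. A sketch of what I mean (the helper name is made up; it takes the loss tensor, e.g. from a closure passed to step(), plus the group's parameters, and assumes every parameter actually contributes to the loss):

    import torch

    def top_hessian_eigenvalue(loss, params, iters=20):
        # Estimate the largest Hessian eigenvalue of `loss` w.r.t. `params`
        # via power iteration on Hessian-vector products, without ever
        # building the full Hessian matrix.
        params = [p for p in params if p.requires_grad]
        grads = torch.autograd.grad(loss, params, create_graph=True)

        def hvp(vec):
            # Hessian-vector product: d/dp of (grad . vec)
            dot = sum((g * v).sum() for g, v in zip(grads, vec))
            return torch.autograd.grad(dot, params, retain_graph=True)

        v = [torch.randn_like(p) for p in params]
        for _ in range(iters):
            hv = hvp(v)
            norm = torch.sqrt(sum((h ** 2).sum() for h in hv))
            v = [h / (norm + 1e-12) for h in hv]

        # Rayleigh quotient v^T H v with the (approximately) converged unit vector
        hv = hvp(v)
        return sum((vi * h).sum() for vi, h in zip(v, hv)).item()

Is something like that the intended approach, or is there a built-in such as torch.autograd.functional.hessian that can be made to work with a param_group?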