RuntimeError: One of the differentiated Tensors appears to not have been used in the graph

I am facing the error

RuntimeError: One of the differentiated Tensors appears to not have been used in the graph. Set allow_unused=True if this is the desired behavior.

when executing the code below. I have tried setting allow_unused=True (shown after the main snippet below), but that just returns a None gradient. I thought torch.argmin might be causing the issue, yet autograd.grad works fine if I differentiate with respect to the second argument, just not the first. In the setup below, n = 5 and inv_cov is the inverse covariance matrix of some data.

import torch

def score_fn(diagonal, invcov):
    # inputs
    # invcov: dim x dim torch tensor (inverse covariance)
    # diagonal: 1-D torch tensor of length dim
    # output
    # score: scalar tensor

    dim = diagonal.shape[0]
    dag = torch.zeros((dim, dim))

    for t in range(dim):
        # pick the variable whose scaled diagonal entry is smallest
        ind = torch.argmin(torch.diag(invcov * torch.diag(diagonal)))
        dag[ind, :] = - invcov[ind, :] / invcov[ind, ind]
        dag[ind, ind] = 0
        # Schur-complement style update that removes variable ind
        invcov = invcov - torch.outer(invcov[:, ind], invcov[ind, :]) / invcov[ind, ind]
        invcov[ind, ind] = 1e6  # large value so ind is not picked again

    score = torch.linalg.norm(dag)

    return score


d = torch.ones(n)
d.requires_grad = True
sc = score_fn(d, inv_cov)
print(torch.autograd.grad(sc, d))
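
For reference, the allow_unused call I mentioned above is roughly the following; it runs without the error, but the gradient for d is just None:

# roughly what I tried; no error is raised, but the returned gradient is None
print(torch.autograd.grad(sc, d, allow_unused=True))  # (None,)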

I have looked over the other forum posts for this error but the suggestions do not resolve my issue. Any help would be appreciated. Thanks!

The d tensor that requires gradients seems to be used only in:

ind = torch.argmin(torch.diag(invcov * torch.diag(diagonal)))

and torch.argmin is not differentiable: it returns an integer index tensor with no grad_fn, so the diagonal argument (your d) is effectively detached from the computation graph. Everything written into dag afterwards depends only on invcov, which is why autograd.grad complains that d was not used, and why allow_unused=True just gives you None.
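
You can see this with a small standalone example (a and b below are just placeholders standing in for diagonal and invcov, not your actual data): the index returned by torch.argmin carries no gradient history, so a tensor that only influences the output through that index is reported as unused.

import torch

a = torch.tensor([3.0, 1.0, 2.0], requires_grad=True)   # plays the role of diagonal
b = torch.tensor([5.0, 6.0, 7.0], requires_grad=True)   # plays the role of invcov

ind = torch.argmin(a)            # integer index tensor, no grad_fn: the graph stops at argmin
print(ind.dtype, ind.grad_fn)    # torch.int64 None

out = b[ind] ** 2   # differentiable w.r.t. b; a only influenced which index was picked

# grad w.r.t. b flows normally, grad w.r.t. a is None (or the RuntimeError without allow_unused)
print(torch.autograd.grad(out, (a, b), allow_unused=True))   # (None, tensor([ 0., 12.,  0.]))

If you really need gradients to reach diagonal, you would have to replace the hard argmin with some differentiable surrogate (for example a softmax-weighted selection), but that changes the algorithm, so whether it is acceptable depends on what the score is supposed to mean.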