Modify gradient computation of `torch.linalg.svd`

I am performing a singular value decomposition during the forward pass with `torch.linalg.svd`, and some of the singular values are repeated. As a result, the gradients blow up during the backward pass: the term `1 / (sigma_i**2 - sigma_j**2)` in the SVD gradient formula divides by zero and produces inf/nan.

What I want to do is modify the SVD backward computation by adding a small epsilon (e.g. `torch.finfo(dtype).eps`) to the denominator of the gradient formula, so that the nans are avoided. How can I do this?
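
A minimal sketch of the failure mode (the matrix and loss here are just illustrative): the identity matrix has all singular values equal to 1, so any loss that touches `U` or `Vh` hits the degenerate denominators.

```python
import torch

# All singular values of the identity are 1 (maximally repeated).
A = torch.eye(3, requires_grad=True)
U, S, Vh = torch.linalg.svd(A, full_matrices=False)

# Any loss involving U or Vh exercises the 1/(sigma_i^2 - sigma_j^2) terms.
(U.sum() + Vh.sum()).backward()
print(A.grad)  # contains nan
```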

You could write a custom `autograd.Function`: use `torch.linalg.svd` in the forward and implement your custom backward, following this tutorial.
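
Here is a minimal sketch of that approach. Assumptions: a real-valued, square, 2-D input; the backward follows the standard reduced-SVD gradient formula (see e.g. Townsend, "Differentiating the Singular Value Decomposition"), with the `1/(s_j**2 - s_i**2)` terms replaced by the regularized `d / (d**2 + eps)`. The class name `StableSVD` is just illustrative.

```python
import torch

class StableSVD(torch.autograd.Function):
    """SVD whose backward regularizes the 1/(s_j^2 - s_i^2) terms.

    Sketch assumptions: real-valued, square, 2-D input. Rectangular
    inputs would need extra projection terms in the backward.
    """

    @staticmethod
    def forward(ctx, A):
        U, S, Vh = torch.linalg.svd(A, full_matrices=False)
        ctx.save_for_backward(U, S, Vh)
        return U, S, Vh

    @staticmethod
    def backward(ctx, gU, gS, gVh):
        # By default autograd materializes unused output grads as zeros,
        # so gU / gS / gVh are never None here.
        U, S, Vh = ctx.saved_tensors
        V, gV = Vh.T, gVh.T
        eps = torch.finfo(S.dtype).eps

        # Regularized F_ij ~ 1 / (s_j^2 - s_i^2): d / (d^2 + eps) stays
        # finite when singular values are repeated (d -> 0 gives F -> 0).
        s2 = S * S
        d = s2.unsqueeze(0) - s2.unsqueeze(1)  # d[i, j] = s_j^2 - s_i^2
        F = d / (d * d + eps)
        F.fill_diagonal_(0.0)

        UtgU = U.T @ gU
        VtgV = V.T @ gV

        # Square-matrix SVD gradient with the regularized F substituted
        # for 1/(s_j^2 - s_i^2):
        inner = (F * (UtgU - UtgU.T)) * S.unsqueeze(0) \
              + S.unsqueeze(1) * (F * (VtgV - VtgV.T)) \
              + torch.diag(gS)
        return U @ inner @ Vh


# Usage: drop-in replacement for torch.linalg.svd on square inputs.
A = torch.eye(3, requires_grad=True)
U, S, Vh = StableSVD.apply(A)
(U.sum() + Vh.sum()).backward()
print(A.grad)  # finite now, no nan
```

Using `d / (d**2 + eps)` rather than `1 / (d + eps)` keeps `F` antisymmetric and avoids a sign flip when `d` is small and negative.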