Differentiable Linear Solver

Is there a linear solver in PyTorch whose differentiation is implemented? Is one planned soon? If not, how do I go about differentiating it by hand?

I feel like I’ve looked at all the solvers in http://pytorch.org/docs/master/torch.html#blas-and-lapack-operations, and none of them implement differentiation. I guess https://github.com/pytorch/pytorch/blob/master/tools/autograd/derivatives.yaml is the file to check to see which derivatives are implemented.

Thank you

I think torch.gesv is differentiable

Yep, that is correct, thank you! I guess I didn’t look hard enough…
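
For anyone landing here, a minimal sketch of using `torch.gesv` inside an autograd graph (note that on recent PyTorch versions `torch.gesv` has been replaced by `torch.linalg.solve`):

```python
import torch

# torch.gesv(B, A) solves A X = B and returns (X, LU).
A = torch.randn(3, 3, requires_grad=True)
b = torch.randn(3, 1, requires_grad=True)

x, _ = torch.gesv(b, A)   # x solves A x = b
loss = x.sum()
loss.backward()           # populates A.grad and b.grad

print(A.grad)
print(b.grad)
```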

Hi, have you encountered an issue where the torch.gesv function is much slower on GPU than on CPU when the matrix A is small, such as 30×30?


Yes, I think it might come from the overhead of launching kernels and transferring the data to GPU memory, which dominates the actual solve for such small matrices.
You should probably use the CPU for small matrices.
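
A rough timing sketch to check this on your own hardware (sizes and iteration count are arbitrary; remember to synchronize so GPU timings are honest):

```python
import time
import torch

def time_solve(A, b, iters=100):
    # Warm up once, then time repeated solves.
    torch.gesv(b, A)
    if A.is_cuda:
        torch.cuda.synchronize()
    start = time.time()
    for _ in range(iters):
        torch.gesv(b, A)
    if A.is_cuda:
        torch.cuda.synchronize()
    return (time.time() - start) / iters

A_cpu, b_cpu = torch.randn(30, 30), torch.randn(30, 1)
print("CPU:", time_solve(A_cpu, b_cpu))
if torch.cuda.is_available():
    print("GPU:", time_solve(A_cpu.cuda(), b_cpu.cuda()))
```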


Is there any detailed documentation of how torch.gesv is implemented? What method does it use? Gaussian elimination? And what formula does it use to compute the gradients?

As mentioned here, it uses the LU factorization. As for the gradient, it is implemented here: https://github.com/pytorch/pytorch/blob/e3e15b5d9534f1c10170e169f1423e9298648e86/tools/autograd/templates/Functions.cpp#L353
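
For reference, the backward pass for x = A⁻¹b follows the standard formulas grad_b = A⁻ᵀ grad_x and grad_A = −grad_b xᵀ. A quick sanity check of those formulas against autograd (a sketch, assuming the old `torch.gesv` API):

```python
import torch

A = torch.randn(4, 4, requires_grad=True)
b = torch.randn(4, 1, requires_grad=True)

x, _ = torch.gesv(b, A)
grad_x = torch.randn_like(x)   # an arbitrary upstream gradient
x.backward(grad_x)

with torch.no_grad():
    # Analytic gradients of x = A^{-1} b:
    #   grad_b = A^{-T} grad_x
    #   grad_A = -grad_b x^T
    grad_b, _ = torch.gesv(grad_x, A.t())
    grad_A = -grad_b.mm(x.t())

print(torch.allclose(b.grad, grad_b))   # expect True
print(torch.allclose(A.grad, grad_A))   # expect True
```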