# Torch.pinverse() seems to be inaccurate

I encountered an unexpected failure in an algorithm I am developing and I have tracked it down to inaccuracy in the PyTorch implementation of the pseudoinverse.

To investigate this I wrote a pseudoinverse function using the QR decomposition:

```python
import torch

def pinv(A):
    """
    Return the pseudoinverse of A,
    without invoking the SVD in torch.pinverse().

    Could also use (but doesn't avoid the SVD):
        R.pinverse().matmul(Q.t())
    """
    rows, cols = A.size()
    if rows >= cols:
        # Tall (or square) matrix: A = QR with R square upper triangular.
        Q, R = torch.qr(A)
        return R.inverse().mm(Q.t())
    else:
        # Wide matrix: factor the transpose instead.
        Q, R = torch.qr(A.t())
        return R.inverse().mm(Q.t()).t()
```

I tested a random tall-and-thin matrix and its transpose:

```python
>>> A = torch.randn(20, 10)
>>> B = A.t()
```

I checked the accuracy of the inverses:

```python
>>> torch.dist(torch.eye(10), pinv(A).mm(A))
tensor(5.9935e-07)
>>> torch.dist(torch.eye(10), A.pinverse().mm(A))
tensor(1.5399e-06)

>>> torch.dist(torch.eye(10), B.mm(pinv(B)))
tensor(5.9935e-07)
>>> torch.dist(torch.eye(10), B.mm(B.pinverse()))
tensor(1.6085e-06)
```

So it seems that the built-in torch.pinverse() is noticeably less accurate than the QR-based version; I suspect this comes from the SVD implementation.
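One way to probe whether the gap really is SVD round-off rather than an algorithmic problem is to repeat the comparison in double precision (my own quick check, not from the original post; the seed is arbitrary):

```python
import torch

torch.manual_seed(0)
A = torch.randn(20, 10)

# If the discrepancy is dominated by float32 round-off inside the SVD,
# the float64 error should be many orders of magnitude smaller.
err32 = torch.dist(torch.eye(10), A.pinverse().mm(A))
err64 = torch.dist(torch.eye(10, dtype=torch.float64),
                   A.double().pinverse().mm(A.double()))
print(err32.item(), err64.item())
```

If `err64` drops to around machine epsilon for float64 (~1e-15), the float32 figures above are just ordinary single-precision round-off rather than a bug.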

Do any linear algebra experts on this forum have comments on this?
Should I worry about the R matrix in the decomposition possibly being singular (not invertible)?

Update:
After some more testing I did find a case where R is singular.
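For instance (a constructed example of my own, not the original failing data), any rank-deficient input produces a singular R:

```python
import torch

# A rank-deficient matrix: the third column is the sum of the first two,
# so rank(A) = 2 < 3.
A = torch.tensor([[1., 1., 2.],
                  [0., 1., 1.],
                  [0., 0., 0.],
                  [1., 0., 1.]])
Q, R = torch.linalg.qr(A)  # torch.qr is deprecated in recent PyTorch
# R inherits the rank deficiency: its last diagonal entry is (numerically)
# zero, so R.inverse() is unusable, while R.pinverse() stays well-defined.
print(torch.diag(R))
```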

So I guess a better solution is:

```python
def pinv(A):
    """
    Return the pseudoinverse of A using the QR decomposition.
    R may be singular, so apply pinverse() to it instead of inverse().
    """
    Q, R = torch.qr(A)
    return R.pinverse().mm(Q.t())
```

This doesn’t solve the accuracy problem in the SVD, but it does fix my original problem: the mostly integer, low-precision values in my matrices are now computed correctly.
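As a quick sanity check (again my own test, using `torch.linalg.qr` since `torch.qr` is deprecated in recent PyTorch), the QR-based version agrees with torch.pinverse() on a full-rank matrix and still returns a finite result on a rank-deficient one:

```python
import torch

def pinv(A):
    """Pseudoinverse via QR; pinverse() on R copes with a singular R."""
    Q, R = torch.linalg.qr(A)
    return R.pinverse().mm(Q.t())

torch.manual_seed(0)
A = torch.randn(20, 10)

# Full-rank case: the QR route and the built-in SVD route agree closely.
print(torch.dist(pinv(A), A.pinverse()))

# Rank-deficient case (last column duplicates the first): R is singular,
# but the result is still finite instead of inverting a singular R.
B = A.clone()
B[:, -1] = B[:, 0]
print(torch.isfinite(pinv(B)).all())
```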


I have a similar problem to the one posted here: https://github.com/pytorch/pytorch/issues/18558. Does anyone know how to solve it?