Using PyTorch, I want to compute the square root of a positive semi-definite matrix. I have googled around for a PyTorch implementation but can't seem to find the right one.
This is what I have found:
https://github.com/steveli/pytorch-sqrtm (this implementation appears to only work for positive definite matrices. I am after an implementation which works for positive semi-definite matrices).
https://github.com/pytorch/pytorch/issues/25481 (this implementation also appears to only work for positive definite matrices. Also, the issue is still open, so I guess they haven't finalized a version yet).
https://github.com/msubhransu/matrix-sqrt (according to the 2nd Github link above, this implementation is not fully PyTorch: it uses PyTorch for the backward pass and Scipy for the forward pass. This implementation doesn't say anything about positive definite or positive semi-definite matrices. Also, when I looked at the PyTorch code, I couldn't understand it because there doesn't seem to be a function or class to call?).
Does anyone know where I could find a PyTorch implementation of the square root of a positive semi-definite matrix? I would greatly appreciate it. Many thanks in advance.
Perform the eigendecomposition of your matrix and then take the
square root of your eigenvalues. (If any eigenvalues of your
semi-definite matrix show up as numerically negative, replace them
with zero.)
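A minimal sketch of that recipe, using `torch.linalg.eigh` for the symmetric eigendecomposition (the function name `psd_sqrt` and the tolerances are my own choices, not from the thread):

```python
import torch

def psd_sqrt(a):
    # Symmetric eigendecomposition: a = V diag(w) V^T
    w, v = torch.linalg.eigh(a)
    # Clamp eigenvalues that come out slightly negative due to round-off
    w = w.clamp(min=0.0)
    # Square root: V diag(sqrt(w)) V^T
    return v @ torch.diag(w.sqrt()) @ v.T

# Example with a rank-deficient (hence only semi-definite) matrix
b = torch.randn(4, 3)
a = b @ b.T  # 4x4, rank <= 3, so one eigenvalue is (numerically) zero
s = psd_sqrt(a)
print(torch.allclose(s @ s, a, atol=1e-4))  # True
```

Because `a` is singular here, `torch.linalg.cholesky`-based approaches would fail, but the clamped eigendecomposition handles it.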
Hi @KFrank, many thanks for your solution. The solution makes sense and it could definitely work for what I need to do. I am implementing a new type of classifier, and one of the functions used is the square root of a positive semi-definite matrix.
One quick question, though: if this can be done as elegantly as you propose, why are people working on a PyTorch function for this that involves more complicated math/algorithms, e.g. in the Github link below, especially in the more recent comments from the last couple of days?
The code Yaroslav posted at the beginning of the github issue to which
you linked is basically what I suggested. He (properly) treats the null
space of the semi-definite matrix more carefully, and he (properly) uses torch.symeig() rather than torch.eig().
Second comment:
More importantly, I’m not an expert, but I have no reason to believe that
eigendecomposition is the best algorithm for the root of a matrix. The
github issue discusses other approaches that could be faster and/or
numerically more stable or accurate.
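One such alternative that comes up in that discussion is the Newton-Schulz iteration, which uses only matrix multiplications (no eigendecomposition). A sketch, with the caveat that convergence is only guaranteed when the normalized matrix is nonsingular; for strictly singular semi-definite matrices it converges only in the limit (the function name and iteration count here are illustrative choices, not from the thread):

```python
import torch

def sqrt_newton_schulz(a, num_iters=20):
    # Newton-Schulz iteration for the matrix square root.
    # After normalizing by the Frobenius norm, the eigenvalues of y
    # lie in [0, 1]; the iteration converges when they are in (0, 2).
    dim = a.shape[0]
    norm_a = a.norm()  # Frobenius norm
    y = a / norm_a
    i = torch.eye(dim, dtype=a.dtype)
    z = torch.eye(dim, dtype=a.dtype)
    for _ in range(num_iters):
        t = 0.5 * (3.0 * i - z @ y)
        y = y @ t  # y converges to (a / norm_a)^(1/2)
        z = t @ z  # z converges to (a / norm_a)^(-1/2)
    return y * norm_a.sqrt()

a = torch.tensor([[4.0, 0.0], [0.0, 1.0]])
print(sqrt_newton_schulz(a))  # approximately [[2, 0], [0, 1]]
```

Because it is just a loop of matrix multiplies, it is GPU-friendly and trivially differentiable, which is presumably part of its appeal in that issue.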
The eigendecomposition contains, in a sense, more information than
the root, so it could well be more expensive to calculate. By way of
analogy, you can use eigendecomposition to calculate the inverse of
a matrix (take the reciprocals of the eigenvalues), but it is not the
preferred matrix-inverse algorithm.
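To make the analogy concrete, here is the "inverse via reciprocal eigenvalues" computation for a symmetric positive-definite matrix, checked against `torch.linalg.inv` (a toy illustration only; the shift by `4 * I` just keeps the example well-conditioned):

```python
import torch

# Build a well-conditioned symmetric positive-definite matrix
b = torch.randn(4, 4, dtype=torch.float64)
a = b @ b.T + 4 * torch.eye(4, dtype=torch.float64)

# Inverse via eigendecomposition: a^-1 = V diag(1/w) V^T
w, v = torch.linalg.eigh(a)
inv_via_eig = v @ torch.diag(1.0 / w) @ v.T

print(torch.allclose(inv_via_eig, torch.linalg.inv(a), atol=1e-8))  # True
```

It gives the right answer, but a factorization-based solve does less work, which is the point of the analogy: the eigendecomposition computes more than the inverse (or the square root) strictly requires.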