Getting incorrect negative signs when doing SVD, why?

My code is listed below. I am trying to verify that the left-singular vectors are eigenvectors of AA* and the right-singular vectors are eigenvectors of A*A, but I am getting unexpected negative signs that throw off my answer, even though the absolute values are the same.

import torch

P = torch.tensor([[25, 2, -5.],
                  [3, -2, 1],
                  [5, 7, 4]])

# Verifying SVD process, works!
print(f"Verifying SVD process, works!\nOriginal matrix:\n{P}")
U, D, VT = torch.linalg.svd(P)
print(f"U:\n{U}\nD:\n{D}\nV:\n{VT.T}")
print(f"Getting orginal matrix from U, D, V values:\n{U@torch.diag(D)@VT}")
print("-"*25)

# Representing U, D, and V as eigenvectors/eigenvalues of P@P.T and P.T@P and verifying SVD process, doesn't work!
print(f"Representing U, D, and V as eigenvectors/eigenvalues of P@P.T and P.T@P and verifying SVD process, doesn't work!\nOriginal matrix:\n{P}")
e_val_1, e_vec_1 = torch.linalg.eig(P@P.T)
e_val_2, e_vec_2 = torch.linalg.eig(P.T@P)
# Reorder the columns of e_vec_2 so they line up with the singular values in
# descending order, and drop the (zero) imaginary parts returned by eig()
e_vec_2 = e_vec_2[:, torch.tensor([0, 2, 1])].to(torch.float)
e_vec_1 = e_vec_1.to(torch.float)
# Singular values are the square roots of the eigenvalues of P.T@P, sorted descending
srt, ind = torch.sort(torch.sqrt(e_val_2).to(torch.float), descending=True)

# U, D, V values are not the same as above, has same absolute values but different signs, but why?
print(f"(These values don't match original) U:\n{e_vec_1}\nD:\n{srt}\nV:\n{e_vec_2}")

print(f"Getting orginal matrix from U, D, V values:\n{e_vec_1.to(torch.float)@torch.diag(srt)@e_vec_2.to(torch.float).T}")

# Need to make the following modifications to make the code work, but why?
e_vec_2 = -e_vec_2
e_vec_1[:,1] = -e_vec_1[:,1]

print(f"After making modifications (listed in code):\n{e_vec_1.to(torch.float)@torch.diag(srt)@e_vec_2.to(torch.float).T}")

Hi Manaswi!

The short story is that your negative signs represent a legitimate phase
ambiguity, rather than being incorrect.

I haven’t looked at your code in any detail, but the basic issue is that
neither the singular vectors in the singular-value decomposition nor the
eigenvectors in the eigendecomposition are uniquely defined. Instead, each
can carry an arbitrary phase, which, in the real case, is an arbitrary sign.
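
To see this concretely, here is a minimal sketch (using your matrix P and
torch.linalg.svd; the variable names are just for illustration) showing that
flipping the sign of one left-singular vector together with its matching
right-singular vector leaves the reconstructed matrix unchanged:

import torch

P = torch.tensor([[25., 2., -5.],
                  [3., -2., 1.],
                  [5., 7., 4.]])

U, D, VT = torch.linalg.svd(P)

# Flip the sign of column 1 of U together with the matching row 1 of VT --
# the product U @ diag(D) @ VT is exactly the same.
U_flip = U.clone()
VT_flip = VT.clone()
U_flip[:, 1] = -U_flip[:, 1]
VT_flip[1, :] = -VT_flip[1, :]

print(torch.allclose(U @ torch.diag(D) @ VT,
                     U_flip @ torch.diag(D) @ VT_flip))   # True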

Quoting, for example, from PyTorch’s torch.linalg.svd() documentation:

Warning

The returned tensors U and V are not unique, nor are they continuous with respect to A. Due to this lack of uniqueness, different hardware and software may compute different singular vectors.

This non-uniqueness is caused by the fact that multiplying any pair of singular vectors u_k, v_k by -1 in the real case or by e^{iϕ}, ϕ ∈ ℝ, in the complex case produces another two valid singular vectors of the matrix. For this reason, the loss function shall not depend on this e^{iϕ} quantity, as it is not well-defined. This is checked for complex inputs when computing the gradients of this function. As such, when inputs are complex and are on a CUDA device, the computation of the gradients of this function synchronizes that device with the CPU.
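
If you want the eigendecomposition route to come out with matched signs, one
option (just a sketch, and it uses torch.linalg.eigh rather than
torch.linalg.eig, since P @ P.T is symmetric) is to take U from the
eigenvectors of P @ P.T and then derive V from U, instead of computing the
two eigendecompositions independently:

import torch

P = torch.tensor([[25., 2., -5.],
                  [3., -2., 1.],
                  [5., 7., 4.]])

# eigh() returns real eigenvalues in ascending order for a symmetric matrix,
# so flip to get the singular values in descending order.
e_val, e_vec = torch.linalg.eigh(P @ P.T)
D = torch.sqrt(e_val.flip(0))      # singular values, descending
U = e_vec.flip(1)                  # left-singular vectors (signs arbitrary)

# Derive each right-singular vector from its matching left-singular vector,
# v_k = P.T @ u_k / d_k, so its sign agrees with u_k by construction.
V = (P.T @ U) / D

print(U @ torch.diag(D) @ V.T)     # matches P up to float32 round-off

Because each v_k is computed directly from its u_k, whatever sign the
eigensolver happens to pick for u_k is automatically matched in v_k, and the
reconstruction works without any manual sign flipping.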

Best.

K. Frank