I can't reconstruct the original matrix from the eigenvectors and eigenvalues returned by torch.symeig

I have a symmetric sparse tensor (a matrix) and use torch.symeig to find its eigenvalues and eigenvectors as follows:

eig_val, eig_vec = torch.symeig(input.to_dense(), eigenvectors=True)

Then I try to reconstruct the original tensor using the formula $input = U \Lambda U^T$:

input_recons = eig_vec*torch.diag(eig_val)*torch.t(eig_vec)

But input_recons is very different from input and I don't know why.

You should use torch.matmul instead of *; otherwise it performs an element-wise product. Here is a working example for a symmetric matrix:

>>> A = [[4.4103, 0.2161, 0.5112],
         [0.2161, 3.8596, 0.7494],
         [0.5112, 0.7494, 3.7300]]
>>> A = torch.tensor(A)
# Display A:
>>> A
tensor([[4.4103, 0.2161, 0.5112],
        [0.2161, 3.8596, 0.7494],
        [0.5112, 0.7494, 3.7300]])

>>> eig_val, eig_vec = torch.symeig(A, eigenvectors=True)
# display eigenvalues
>>> eig_val
tensor([2.9999, 4.0000, 5.0000])
>>> eig_vec
tensor([[-0.1817, -0.7236, -0.6658],
        [-0.6198,  0.6100, -0.4937],
        [ 0.7634,  0.3229, -0.5594]])

# Reconstruct the original using V * diag(e) * V^T:
>>> torch.matmul(torch.matmul(eig_vec, torch.diag(eig_val)), torch.t(eig_vec))
tensor([[4.4103, 0.2161, 0.5112],
        [0.2161, 3.8596, 0.7494],
        [0.5112, 0.7494, 3.7300]])

Oops, such a huge mistake, thanks a lot.