It seems that the symeig() function does not return proper eigenvectors. See the following simple test script:
import torch

# Create a random (p x q) matrix and form a symmetric covariance matrix from it
p, q = 10, 3
torch.manual_seed(0)
in_tensor = torch.rand(p, q, dtype=torch.float64, requires_grad=True).cuda()
cov_in = torch.mm(in_tensor.t(), in_tensor)

# Eigendecomposition of the symmetric matrix
_, eig_vecs = torch.symeig(cov_in)
print(eig_vecs)

# The eigenvectors should be orthonormal, so both products should be (close to) the identity
print(torch.mm(eig_vecs, eig_vecs.t()))
print(torch.mm(eig_vecs.t(), eig_vecs))

# ... and every norm should be 1
print(eig_vecs.norm(dim=0))
print(eig_vecs.norm(dim=1))
Here is the result:
tensor([[ 0.3573,  0.0288,  0.4334],
        [ 3.1050,  6.5767, -3.2518],
        [ 2.3730,  2.2232,  2.1961]],
       device='cuda:0', dtype=torch.float64, grad_fn=<SymeigBackward>)
tensor([[ 0.3164, -0.1103,  1.8639],
        [-0.1103, 63.4675, 14.8483],
        [ 1.8639, 14.8483, 15.3969]],
       device='cuda:0', dtype=torch.float64, grad_fn=<MmBackward>)
tensor([[ 15.3995,  25.7064,  -4.7303],
        [ 25.7064,  48.1965, -16.4908],
        [ -4.7303, -16.4908,  15.5848]],
       device='cuda:0', dtype=torch.float64, grad_fn=<MmBackward>)
tensor([3.9242, 6.9424, 3.9478],
       device='cuda:0', dtype=torch.float64, grad_fn=<NormBackward1>)
tensor([0.5625, 7.9667, 3.9239],
       device='cuda:0', dtype=torch.float64, grad_fn=<NormBackward1>)
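For reference, since cov_in is symmetric, I would expect the returned eigenvectors to be orthonormal. A minimal sketch of the check I have in mind (the helper is_orthonormal is my own, just for illustration; eig_vecs is the tensor from the script above) fails on this output:

import torch

# Eigenvectors of a symmetric matrix should satisfy: V^T V ~ identity and
# every column should have unit norm. Helper written for this post only.
def is_orthonormal(V, atol=1e-6):
    identity = torch.eye(V.shape[1], dtype=V.dtype, device=V.device)
    ones = torch.ones(V.shape[1], dtype=V.dtype, device=V.device)
    return (torch.allclose(torch.mm(V.t(), V), identity, atol=atol)
            and torch.allclose(V.norm(dim=0), ones, atol=atol))

print(is_orthonormal(eig_vecs))  # prints False for the tensors shown above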
Is there a bug in the symeig() function?