About the uniqueness of eigen decomposition with torch.eigh

Hi, I would like to train a network whose loss function uses the eigenvectors of two Hermitian matrices. However, as PyTorch's documentation for eigh says, the computed eigenvectors are not unique because of their phases. Is there any way to "force" the eigh function to return eigenvectors with only one phase?

Hi Wasabi!

As noted in the documentation you cite, what matters is whether your
loss function depends on the (non-unique) phase of your eigenvectors.
If it doesn't, things will be fine and autograd will backpropagate
correctly through eigh().
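As a minimal sketch of what "phase-invariant" means in practice: a loss built from the rank-1 projector v v^H rather than from v itself is unchanged when v is multiplied by exp(i*phi), so gradients flow through `torch.linalg.eigh()` without any gauge issue. (The target projector below is just a toy example.)

```python
import torch

torch.manual_seed(0)

# Build a random Hermitian matrix whose entries require grad.
x = torch.randn(4, 4, dtype=torch.complex128, requires_grad=True)
a = x + x.conj().mT  # Hermitian by construction

evals, evecs = torch.linalg.eigh(a)

# Rank-1 projector onto the lowest eigenvector: v v^H. Replacing v by
# exp(i*phi) * v leaves this projector unchanged, so any loss built
# from it is phase-invariant.
v = evecs[:, 0]
proj = torch.outer(v, v.conj())

# A toy phase-invariant loss: squared distance to a fixed target projector.
target = torch.zeros(4, 4, dtype=torch.complex128)
target[0, 0] = 1.0
loss = (proj - target).abs().pow(2).sum()
loss.backward()  # gradients land in x.grad with no phase ambiguity
```

You can check the invariance directly: recomputing the loss with `v * torch.exp(1j * phi)` in place of `v` gives exactly the same value for any phi.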

If your loss function does depend on the phase of your eigenvectors, then
you are trying to do something that doesn’t make sense. Even if you were
somehow able to coerce your eigenvectors to have well-defined phases, you
would have merely swept the issue under the rug. (You could expect
your loss function to have a discontinuous jump at some otherwise-sensible
set of parameter values, or some other pathology.)

As an aside, if you have degenerate eigenvalues, then the whole eigenvector
(not just its phase) is not unique. I think in such a case that autograd would
raise an exception were you to attempt to backpropagate. Regardless, if
your loss function were to depend only on the (multidimensional) subspaces
spanned by the sets of degenerate eigenvectors, rather than on any specific
eigenvector, backpropagation of your loss function would be mathematically
well-defined, even if autograd and eigh() couldn’t handle it.
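A small illustration of the subspace point, using a matrix with an exactly two-fold degenerate eigenvalue: the individual eigenvectors returned by `torch.linalg.eigh()` for that eigenvalue are an arbitrary orthonormal basis, but the projector onto the whole degenerate subspace is unique.

```python
import torch

# A Hermitian matrix with a two-fold degenerate eigenvalue 1.0.
a = torch.diag(torch.tensor([1.0, 1.0, 2.0], dtype=torch.complex128))

evals, evecs = torch.linalg.eigh(a)  # eigenvalues in ascending order

# Any orthonormal basis of the degenerate plane is a valid answer for
# the individual eigenvectors, but the projector onto the subspace they
# span is the same for every choice of basis:
mask = (evals - 1.0).abs() < 1e-10
sub = evecs[:, mask]                 # 3 x 2 basis of the degenerate subspace
proj = sub @ sub.conj().mT           # unique: equals diag(1, 1, 0) here
```

So a loss written in terms of `proj` is mathematically well-defined even with degeneracies, whereas a loss written in terms of the columns of `sub` is not.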

Best.

K. Frank
