Numerical method for the Eigenvectors of a $3\times3$ symmetric matrix

I am looking for an ad-hoc algorithm to quickly find the Eigenvectors of a symmetric $3\times3$ matrix. I already have a solution for the Eigenvalues by direct resolution of the characteristic equation.
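
For reference, one standard non-iterative way of doing that step is the trigonometric solution of the characteristic cubic; the sketch below (NumPy, assuming a real symmetric input) is one such variant, not necessarily the exact formulation I use:

```python
import numpy as np

def symmetric_3x3_eigenvalues(A):
    """Eigenvalues of a real symmetric 3x3 matrix A via the trigonometric
    solution of the characteristic cubic, returned in ascending order."""
    p1 = A[0, 1]**2 + A[0, 2]**2 + A[1, 2]**2
    if p1 == 0.0:                              # A is already diagonal
        return np.sort(np.diag(A))
    q = np.trace(A) / 3.0
    p2 = (A[0, 0] - q)**2 + (A[1, 1] - q)**2 + (A[2, 2] - q)**2 + 2.0 * p1
    p = np.sqrt(p2 / 6.0)
    B = (A - q * np.eye(3)) / p
    r = np.clip(np.linalg.det(B) / 2.0, -1.0, 1.0)
    phi = np.arccos(r) / 3.0
    lam3 = q + 2.0 * p * np.cos(phi)                      # largest
    lam1 = q + 2.0 * p * np.cos(phi + 2.0 * np.pi / 3.0)  # smallest
    lam2 = 3.0 * q - lam1 - lam3                          # trace identity
    return np.array([lam1, lam2, lam3])
```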

I want a solution that is efficient (presumably non-iterative), yet able to handle corner cases such as equal Eigenvalues. One option would be to unroll a general algorithm to the $3\times3$ case, but this can get complicated.

As the Eigenvalues are available, Gaussian elimination is possible, but I don’t see how it could handle equal Eigenvalues.

Any suggestions?

Update:

Vectorially, when $\lambda$ is an Eigenvalue, the three column vectors of $M-\lambda I$ are coplanar. The coefficients of the vanishing linear combination of those three vectors form the sought eigenvector, which, since $M$ is symmetric, is the vector normal to that plane.

Numerically, one can form the three pairwise cross-products of those vectors and keep the product with the largest norm. This involves the computation of nine $2\times2$ minors and three squared norms ($15$ additions and $27$ multiplications), and must be repeated for the three Eigenvalues.

When two Eigenvalues are equal, $M-\lambda I$ has rank one, so the column vectors are parallel and all three cross-products vanish.
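
A minimal sketch of this cross-product computation, assuming NumPy (the function name is just illustrative):

```python
import numpy as np

def eigenvector_from_eigenvalue(M, lam):
    """Eigenvector of the symmetric 3x3 matrix M for the eigenvalue lam,
    taken as the largest of the three pairwise cross-products of the
    columns of M - lam*I (which are coplanar when lam is an eigenvalue)."""
    A = M - lam * np.eye(3)
    c0, c1, c2 = A[:, 0], A[:, 1], A[:, 2]
    candidates = (np.cross(c0, c1), np.cross(c1, c2), np.cross(c2, c0))
    norms2 = [np.dot(v, v) for v in candidates]   # three squared norms
    best = int(np.argmax(norms2))
    if norms2[best] == 0.0:
        # all cross-products vanish: lam is a (numerically) multiple
        # eigenvalue and this method alone cannot pick a direction
        raise ValueError("multiple eigenvalue: cross-products vanish")
    return candidates[best] / np.sqrt(norms2[best])
```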

=================

2 Answers

=================

Gaussian elimination will give you a non-orthogonal basis for an eigenspace even in the case where there are equal eigenvalues. You can then orthogonalize this basis by Gram-Schmidt. You will have to pick some tolerances for deciding when two or three eigenvalues are equal.
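
One way that recipe could look in code, assuming NumPy; the helper names and the tolerance value are placeholder choices, not part of the answer:

```python
import numpy as np

def nullspace_basis(A, tol=1e-8):
    """Basis of the null space of a 3x3 matrix A, found by Gauss-Jordan
    elimination with partial pivoting; pivots below tol are treated as
    zero (this is where the 'equal eigenvalues' decision hides)."""
    R = A.astype(float).copy()
    n = 3
    pivot_cols, row = [], 0
    for col in range(n):
        p = row + int(np.argmax(np.abs(R[row:, col])))
        if abs(R[p, col]) < tol:
            continue                        # no usable pivot: free column
        R[[row, p]] = R[[p, row]]           # swap rows
        R[row] /= R[row, col]               # normalize pivot row
        for r in range(n):
            if r != row:
                R[r] -= R[r, col] * R[row]  # eliminate the column elsewhere
        pivot_cols.append(col)
        row += 1
        if row == n:
            break
    basis = []
    for f in (c for c in range(n) if c not in pivot_cols):
        v = np.zeros(n)
        v[f] = 1.0
        for i, pc in enumerate(pivot_cols):
            v[pc] = -R[i, f]
        basis.append(v)
    return basis

def orthonormalize(vectors):
    """Classical Gram-Schmidt on a short list of independent vectors."""
    out = []
    for v in vectors:
        w = v - sum(np.dot(v, u) * u for u in out)
        out.append(w / np.linalg.norm(w))
    return out
```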

Could one imagine a solution such that you needn’t make any decision about equality of the Eigenvalues?
– Yves Daoust
Oct 20 at 19:41

With Gaussian elimination you’ll always need a tolerance parameter of some sort, so any solution involving Gaussian elimination involves making decisions about whether two floating point values are “close.” If you try Gaussian elimination with one eigenvalue and find that the system has a null space of dimension 2 then you’ve got a double eigenvalue. If the dimension of the nullspace is 3, then you’ve got a triple eigenvalue.
– Brian Borchers
Oct 20 at 19:47
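
For instance, with the hypothetical nullspace_basis helper sketched under the answer, the multiplicity test could read:

```python
import numpy as np

# Toy check of the multiplicity criterion: lam = 1 is a double eigenvalue
# of diag(1, 1, 2), so M - lam*I should have a two-dimensional null space.
M = np.diag([1.0, 1.0, 2.0])
lam = 1.0
dim = len(nullspace_basis(M - lam * np.eye(3), tol=1e-8))
print(dim)   # 2 -> double eigenvalue; 3 would mean M == lam * I
```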

As a practical matter, you may find that Francis’s algorithm is so fast on a 3×3 matrix that there’s no point in trying to craft something by hand.
– Brian Borchers
Oct 20 at 19:48
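
For comparison, the library route (LAPACK's symmetric eigensolver through NumPy, which is not literally Francis's QR iteration but serves the same "just use a general solver" purpose):

```python
import numpy as np

M = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])
# Eigenvalues in ascending order; the columns of V are orthonormal
# eigenvectors, with repeated eigenvalues handled transparently.
w, V = np.linalg.eigh(M)
```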

If I am right, Gaussian elimination would cost like $9+4$ comparisons to choose the pivots and the computation of six $2\times2$ determinants (in the standard case). This is my target complexity.
– Yves Daoust
Oct 20 at 19:59

=================

The problem you are talking about is numerically unstable: at the identity matrix, any orthogonal basis could be your answer, but by arbitrarily small perturbations of the identity matrix you could force any one particular orthogonal basis to be the unique (up to scalar factors $-1$) answer. Therefore your goal of having a single (continuous) formula, parametrised by the eigenvalue, describing the eigenvectors is unattainable.

Also it is not clear to me how you would go about having a formula in the case of multiple eigenvalues; do you imagine having a separate formula for the first (chosen) eigenvalue $\lambda$, then another formula for the second one, valid only if $\lambda$ is a multiple eigenvalue? In any case the stability argument shows that no continuous formula for the eigenvector of the first eigenvalue can exist.
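
A small numerical illustration of this instability, assuming NumPy (the perturbation direction S is arbitrary):

```python
import numpy as np

# However small eps is, I + eps*S has the eigenvectors of S, while at
# eps = 0 every orthonormal basis would do: the eigenvector basis does
# not depend continuously on the matrix entries near the identity.
S = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 2.0],
              [0.0, 2.0, 0.0]])      # an arbitrary symmetric direction
for eps in (1e-3, 1e-9):
    _, V = np.linalg.eigh(np.eye(3) + eps * S)
    print(np.round(V, 6))            # same columns (up to sign) for both eps
```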

Yes, I understand this phenomenon. In the case of equal Eigenvalues, you need a mechanism to “break ties” and choose among an infinity of possible directions, and this is a source of discontinuity. Besides detecting equal Eigenvalues and switching to an ad-hoc procedure, I like the idea of perturbations: wouldn’t it be possible to add a data-dependent perturbation which has a small effect far from the singular case and has a tie-breaking effect close to it?
– Yves Daoust
2 days ago

@YvesDaoust I don’t really see what kind of data-dependence you have in mind, and adding a random perturbation is equally likely to make the situation more singular than less singular than it was (unless you are working from the hypothesis that it was singular). So in short, no, I don’t think that would be possible/practical.
– Marc van Leeuwen
16 hours ago