## 4.7. Numerical computation of eigenvalues

Let *A* be a diagonalizable $n \times n$ matrix, i.e., *A* has *n* linearly independent eigenvectors $\mathbf{v}_1, \dots, \mathbf{v}_n$ with corresponding eigenvalues $\lambda_1, \dots, \lambda_n$. Assume $\lambda_1 \in \mathbb{R}$ and

$$|\lambda_i| < |\lambda_1|, \quad i = 2, \dots, n. \tag{1}$$

Let $\mathbf{x}_0 \in \mathbb{R}^n$. Since $\{\mathbf{v}_1, \dots, \mathbf{v}_n\}$ is a basis for $\mathbb{R}^n$, we have

$$\mathbf{x}_0 = \sum_{i=1}^{n} c_i \mathbf{v}_i.$$
Consider the sequence of vectors $\mathbf{x}_k = A\mathbf{x}_{k-1}$:

$$\mathbf{x}_1 = A\mathbf{x}_0 = \sum_{i=1}^{n} c_i A\mathbf{v}_i = \sum_{i=1}^{n} c_i \lambda_i \mathbf{v}_i$$

$$\mathbf{x}_2 = \sum_{i=1}^{n} c_i \lambda_i^2 \mathbf{v}_i$$

$$\vdots$$

$$\mathbf{x}_k = \sum_{i=1}^{n} c_i \lambda_i^k \mathbf{v}_i = \lambda_1^k \left[\, c_1 \mathbf{v}_1 + \sum_{i=2}^{n} c_i \left(\frac{\lambda_i}{\lambda_1}\right)^{k} \mathbf{v}_i \,\right]$$

so that, if $c_1 \neq 0$, $\mathbf{x}_k$ approaches a vector parallel to the eigenvector $\mathbf{v}_1$ as $k \to \infty$.

**Example 1: Numerical approximation of the largest eigenvalue**
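The text's example itself is not reproduced here; the following is a minimal sketch of the iteration above, with a $2 \times 2$ test matrix, starting vector, and helper functions of our own choosing:

```python
# Power-method sketch. The matrix A and starting vector are illustrative
# choices, not taken from the text's example.

def matvec(A, x):
    # y = A x for a matrix stored as a list of rows
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def power_method(A, x0, iters=50):
    x = x0
    lam = 0.0
    for _ in range(iters):
        y = matvec(A, x)                  # x_{k+1} = A x_k
        lam = dot(x, y) / dot(x, x)       # (x_k . x_{k+1}) / (x_k . x_k)
        m = max(abs(c) for c in y)        # rescale to avoid overflow
        x = [c / m for c in y]
    return lam, x

A = [[2.0, 1.0], [1.0, 2.0]]              # eigenvalues 3 and 1
lam, v = power_method(A, [1.0, 0.0])
print(lam)                                # ≈ 3.0, eigenvector direction (1, 1)
```

The rescaling by the largest component changes only the length of $\mathbf{x}_k$, not its direction, so it does not affect the eigenvalue estimate.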

**Note.** If $c_1 = 0$ ($\mathbf{x}_0$ does not have a component parallel to $\mathbf{v}_1$), we have to try another initial vector. We often choose $\mathbf{x}_0 = (1, 1, \dots, 1)$, unless it happens to be an eigenvector corresponding to another eigenvalue. This method does not apply to complex eigenvalues. Reason: if $\lambda_1 = u + iv$ is an eigenvalue of the real matrix *A*, then its conjugate $\lambda_2 = u - iv$ is also an eigenvalue, and $|\lambda_1| = |\lambda_2|$, so condition (1) fails.
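To see the failure concretely, here is a small illustration of our own (not from the text): the rotation matrix below has eigenvalues $\pm i$, equal in absolute value, and the power iterates simply cycle instead of converging.

```python
# Illustration of the failure for complex eigenvalues: this rotation matrix
# has eigenvalues ±i, so |λ1| = |λ2| and condition (1) fails.
A = [[0.0, -1.0], [1.0, 0.0]]

x = [1.0, 0.0]
seen = []
for _ in range(4):
    x = [A[0][0] * x[0] + A[0][1] * x[1],
         A[1][0] * x[0] + A[1][1] * x[1]]
    seen.append((x[0], x[1]))
print(seen)  # the iterates cycle: (0,1), (-1,0), (0,-1), (1,0)
```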

For the eigenvalue $\lambda_1$ we have the approximation

$$\lambda_1 \approx \frac{\mathbf{x}_k \cdot \mathbf{x}_{k+1}}{\mathbf{x}_k \cdot \mathbf{x}_k} = \lambda_1^{(k)},$$

because $\mathbf{x}_k \approx c\,\mathbf{v}_1$ (with $c = c_1 \lambda_1^k$), and thus

$$\frac{\mathbf{x}_k \cdot \mathbf{x}_{k+1}}{\mathbf{x}_k \cdot \mathbf{x}_k} \approx \frac{c\mathbf{v}_1 \cdot A\,c\mathbf{v}_1}{c\mathbf{v}_1 \cdot c\mathbf{v}_1} = \frac{c\mathbf{v}_1 \cdot c\lambda_1\mathbf{v}_1}{c\mathbf{v}_1 \cdot c\mathbf{v}_1} = \frac{c^2 \lambda_1\,(\mathbf{v}_1 \cdot \mathbf{v}_1)}{c^2\,(\mathbf{v}_1 \cdot \mathbf{v}_1)} = \lambda_1.$$

Once $\lambda_1$ has been computed, we execute the so-called deflation: using suitable similarity transformations we find a matrix

$$B = \begin{pmatrix} \lambda_1 & \mathbf{b}^{\mathsf T} \\ \mathbf{0} & C \end{pmatrix}$$

which is similar to *A*; the eigenvalues of the $(n-1) \times (n-1)$ matrix *C* are the remaining eigenvalues $\lambda_2, \dots, \lambda_n$ of *A*. After this we proceed as above.
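A runnable sketch of one deflation variant, Hotelling deflation for symmetric matrices, which differs from the similarity-transform construction above but serves the same purpose; the matrix and eigenvector are our own illustrative choices:

```python
# Hotelling deflation for a symmetric matrix: with a unit eigenvector v1
# for λ1, the matrix B = A - λ1 v1 v1^T has eigenvalues 0, λ2, ..., λn,
# so the power method applied to B finds λ2. A and v1 are illustrative.
import math

A = [[2.0, 1.0], [1.0, 2.0]]                 # eigenvalues 3 and 1
lam1 = 3.0
v1 = [1.0 / math.sqrt(2.0)] * 2              # unit eigenvector for λ1

B = [[A[i][j] - lam1 * v1[i] * v1[j] for j in range(2)] for i in range(2)]

x = [1.0, 0.0]
for _ in range(30):                          # power method on B
    y = [B[0][0] * x[0] + B[0][1] * x[1],
         B[1][0] * x[0] + B[1][1] * x[1]]
    lam2 = (x[0] * y[0] + x[1] * y[1]) / (x[0] * x[0] + x[1] * x[1])
    m = max(abs(y[0]), abs(y[1]))
    x = [y[0] / m, y[1] / m]
print(lam2)                                  # ≈ 1.0 = λ2
```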

Question: When can the above procedure be applied to find the eigenvalue of *A* of smallest absolute value?

**Note.** The eigenvalue problem is very sensitive to perturbations. Errors can be reduced by preprocessing the matrix in various ways; see Grossman, Stanley: Elementary Linear Algebra.

**Example 2: An error in an eigenvalue problem**
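The text's Example 2 is not reproduced here; as one illustration of how sensitive eigenvalues can be, consider this classical example of our own choosing, where a perturbation of size $\varepsilon$ in a single entry moves the eigenvalues by roughly $\sqrt{1000\,\varepsilon}$:

```python
# Perturbing the (2,1) entry of [[1, 1000], [0, 1]] by eps shifts the
# double eigenvalue 1 by ±sqrt(1000 * eps): an error of order sqrt(eps),
# enormously larger than eps itself. The matrix is our own illustration.
import math

eps = 1e-6
# characteristic polynomial of [[1, 1000], [eps, 1]]: (1 - t)^2 - 1000*eps
shift = math.sqrt(1000.0 * eps)
lams = [1.0 - shift, 1.0 + shift]
print(lams)  # roughly [0.968, 1.032], although eps is only 1e-6
```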

**Note.**
It can be shown that the eigenvalues of a real symmetric matrix *A*
are real and that *A* is diagonalizable.
The eigenvalues of a real skew-symmetric
matrix are purely imaginary (the real part of every eigenvalue is zero).
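A quick $2 \times 2$ spot check of both claims, using the quadratic formula for the characteristic polynomial; the matrices are our own examples:

```python
# 2x2 spot check: a symmetric example gives real eigenvalues, a
# skew-symmetric one gives purely imaginary eigenvalues.
import cmath

def eig2(a, b, c, d):
    # eigenvalues of [[a, b], [c, d]] via the quadratic formula
    tr = a + d
    det = a * d - b * c
    disc = cmath.sqrt(tr * tr - 4.0 * det)
    return (tr + disc) / 2.0, (tr - disc) / 2.0

sym = eig2(2.0, 1.0, 1.0, 2.0)    # symmetric: eigenvalues 3 and 1 (real)
skew = eig2(0.0, -2.0, 2.0, 0.0)  # skew-symmetric: eigenvalues ±2i
print(sym, skew)
```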

Exercises: E49
