
I need to find eigenvalues/eigenvectors of different kinds of $n \times n$ matrices. For example, how would I determine these for the matrices listed below? What is the typical process? Should I always go the route of finding the eigenvalues as roots of the characteristic polynomial and then getting the eigenvectors by solving $(\mathbf{A} - \lambda \mathbf{I})\mathbf{x} = 0$?

$\begin{bmatrix} 2&0&0\\ 1&2&0\\ 0& 1 & 2 \end{bmatrix} $

$\begin{bmatrix} 4 &1 &1 &1 \\ 1&4 &1 &1 \\ 1&1 &4 &1 \\ 1& 1& 1& 4 \end{bmatrix}$

These are just examples; in general I want to find eigenvectors of $n \times n$ matrices. If you can show me the process of finding the solution for one of these matrices, that would be helpful.

  • http://en.wikipedia.org/wiki/Eigenvalue_algorithm (2010-08-09)
  • The first example you gave is (lower) triangular; thus the diagonal elements are eigenvalues. By hand, you can try to generate the characteristic polynomial; a computer, however, will (attempt to) reduce your matrix to a (quasi-)triangular matrix due to the ease of computing a triangular matrix's eigenvalues. (2010-08-09)
  • Yes, but the question concerns eigenvectors, which is a more challenging problem. (2010-08-09)
  • Are these not solvable by hand? (2010-08-09)
  • @saminny: See Qiaochu's link, in particular [this section](http://en.wikipedia.org/wiki/Eigenvalue_algorithm#Identifying_eigenvectors), for an example of how to calculate eigenvectors. If there is something you don't understand about that article, ask a specific question. (2010-08-09)
  • Also this might be interesting: **Computing Eigenvalues and Eigenvectors without Determinants**, [jstor.org/stable/2691340](http://www.jstor.org/stable/2691340) (2011-09-13)

2 Answers


The standard, algorithmic/recipe, don't-think-too-much-about-it way of doing it is to compute the determinant of $A-\lambda I$ to get the characteristic polynomial, use the characteristic polynomial to find the eigenvalues $\lambda_i$, and then, for each eigenvalue, compute the nullspace of $A-\lambda_iI$ to get the eigenvectors.
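As a concrete illustration of that recipe, here is a hand-rolled pure-Python sketch for the $2\times 2$ case (the function name is mine, and it assumes the characteristic polynomial has real roots):

```python
# Sketch of the recipe for a 2x2 matrix [[a, b], [c, d]], assuming real roots.
# det(A - t*I) = (a - t)(d - t) - b*c = t^2 - (a + d)*t + (a*d - b*c)
import math

def eig_2x2(a, b, c, d):
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)    # assumes a real spectrum
    eigenpairs = []
    for lam in ((tr + disc) / 2, (tr - disc) / 2):
        # Solve (A - lam*I) x = 0; for a 2x2 one row of the system suffices.
        if b != 0:
            v = (b, lam - a)
        elif c != 0:
            v = (lam - d, c)
        else:                              # A is diagonal: standard basis vectors
            v = (1, 0) if lam == a else (0, 1)
        eigenpairs.append((lam, v))
    return eigenpairs

# Example: [[4, 1], [1, 4]] has trace 8 and determinant 15, so the
# characteristic polynomial is t^2 - 8t + 15 = (t - 5)(t - 3).
print(eig_2x2(4, 1, 1, 4))   # eigenvalues 5 and 3
```

For larger matrices the nullspace step is a Gaussian elimination rather than a one-row formula, but the shape of the computation is the same.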

However, there are often sundry shortcuts. For example:

  1. The trace of the matrix equals the sum of the eigenvalues (over the complex numbers), and the determinant equals the product of the eigenvalues. This can often help you find some, if not all, of the eigenvalues.

  2. If every row of the matrix adds up to the same constant $c$, then $c$ is an eigenvalue, and $(1,1,\ldots,1)$ is an eigenvector associated to $c$.

  3. Since the eigenvalues of $A$ and the eigenvalues of $A^t$ are the same, if every column of $A$ adds up to the same constant $c$, then $c$ is an eigenvalue.

  4. If $A$ has eigenvalue $\lambda$ with eigenvectors $\mathbf{b}_1,\ldots,\mathbf{b}_m$, then $aA+cI$ has eigenvalue $a\lambda+c$ with eigenvectors $\mathbf{b}_1,\ldots,\mathbf{b}_m$.

  5. If $A$ is block-diagonal or block-triangular, then the eigenvalues of $A$ are the eigenvalues of the diagonal blocks.
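The shift shortcut (item 4) is easy to sanity-check numerically; here is a minimal pure-Python check using the all-ones $2\times 2$ matrix (my own example, with matrices as lists of rows):

```python
# Spot check of shortcut 4: if A has eigenpair (lam, v), then aA + cI has
# eigenpair (a*lam + c, v).  Here A is the all-ones 2x2 matrix, whose
# eigenvalues are 2 and 0; take a = 2, c = 3.

def matvec(A, x):
    return [sum(a * b for a, b in zip(row, x)) for row in A]

A = [[1, 1], [1, 1]]
B = [[2 * A[i][j] + (3 if i == j else 0) for j in range(2)]
     for i in range(2)]                       # B = 2A + 3I = [[5, 2], [2, 5]]

# (1, 1) is an eigenvector of A for 2, hence of B for 2*2 + 3 = 7.
assert matvec(A, [1, 1]) == [2, 2]
assert matvec(B, [1, 1]) == [7, 7]
# (1, -1) is an eigenvector of A for 0, hence of B for 2*0 + 3 = 3.
assert matvec(B, [1, -1]) == [3, -3]
print("shift shortcut verified")
```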

For instance, your second matrix is the same as $$\left(\begin{array}{cccc} 1&1&1&1\\ 1&1&1&1\\ 1&1&1&1\\ 1&1&1&1 \end{array}\right) + 3I.$$ The all-ones matrix has eigenvalues $0$ (because it is not invertible) and $4$ (because every row adds up to $4$), so your matrix will certainly have eigenvalues $0+3=3$ and $4+3=7$. That gives you two of the eigenvalues. By inspection, the all-ones matrix has eigenvector $(1,1,1,1)$ associated to $4$, and $(1,-1,0,0)$, $(1,0,-1,0)$, and $(1,0,0,-1)$ are eigenvectors associated to $0$; these four vectors are linearly independent, so that accounts for everything. These eigenvectors are therefore also eigenvectors of your original matrix, associated to $7$ (the first one) and to $3$ (the second through fourth ones).
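Those four eigenpairs are easy to verify directly; a minimal pure-Python check:

```python
# Verify the claimed eigenpairs of the 4x4 matrix (matrices as lists of rows).

def matvec(A, x):
    return [sum(a * b for a, b in zip(row, x)) for row in A]

A = [[4, 1, 1, 1],
     [1, 4, 1, 1],
     [1, 1, 4, 1],
     [1, 1, 1, 4]]

# (1,1,1,1) is an eigenvector for 7 ...
assert matvec(A, [1, 1, 1, 1]) == [7, 7, 7, 7]
# ... and the three vectors below are eigenvectors for 3.
for v in ([1, -1, 0, 0], [1, 0, -1, 0], [1, 0, 0, -1]):
    assert matvec(A, v) == [3 * c for c in v]
print("all four eigenpairs verified")
```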

Your first matrix is block triangular (actually, triangular), so the eigenvalues are the eigenvalues of the diagonal blocks (here, the $1\times 1$ blocks $2$). So the only eigenvalue is $2$. There is an obvious eigenvector, $(0,0,1)$, and since the rank of $A-2I$ is $2$, the nullity is $1$, so that is the only eigenvector (up to scalar multiples).
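This, too, can be checked in a few lines of Python (my own spot check, not part of the recipe):

```python
# Direct check for the triangular matrix: (0,0,1) is an eigenvector for 2,
# while e1 and e2 are not (multiplication shifts their weight down one slot).

def matvec(A, x):
    return [sum(a * b for a, b in zip(row, x)) for row in A]

A = [[2, 0, 0],
     [1, 2, 0],
     [0, 1, 2]]

assert matvec(A, [0, 0, 1]) == [0, 0, 2]    # A e3 = 2 e3
assert matvec(A, [1, 0, 0]) == [2, 1, 0]    # A e1 != 2 e1
assert matvec(A, [0, 1, 0]) == [0, 2, 1]    # A e2 != 2 e2
print("only e3 (up to scaling) is an eigenvector")
```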

Most of these are heuristics, as opposed to algorithms. Of course, the "algorithmic" nature of "compute determinant, find characteristic polynomial, find eigenvalues, find eigenvectors" also includes a hidden heuristic component, in that factoring the polynomial often uses heuristics.

  • As a tiny addendum: Arturo's indications are for *computation by hand*. If you're programming a computer to calculate eigenvalues, [different methods](http://netlib.org/eispack/) are needed. (2011-09-13)

Just a little warning about the first example: let's call that matrix $J$. It is a (non-diagonal) Jordan matrix, so it is not diagonalizable. That is, there is no basis of eigenvectors for this matrix, although, as J. M. observed, $2$ is the only eigenvalue and the last vector of the standard basis of $\mathbb{R}^3$ is an eigenvector, since, as you can see, $Je_3 = 2e_3$.

You can convince yourself that $J$ is not diagonalizable by computing its characteristic polynomial, $Q(t) = (2-t)^3$, and observing that the eigenvalue $2$ has algebraic multiplicity $3$ while the eigenspace $\ker (J-2I)$ has dimension $1$.
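The failure of diagonalizability can also be seen computationally: $N = J - 2I$ is nilpotent of index $3$, i.e. $N^2 \neq 0$ but $N^3 = 0$, which for a $3\times 3$ matrix forces a single Jordan block and hence a one-dimensional eigenspace. A small pure-Python check:

```python
# N = J - 2I is nilpotent of index 3: N^2 != 0 but N^3 = 0.

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

N = [[0, 0, 0],
     [1, 0, 0],
     [0, 1, 0]]          # J - 2I for J = [[2,0,0],[1,2,0],[0,1,2]]

N2 = matmul(N, N)
N3 = matmul(N2, N)
assert N2 == [[0, 0, 0], [0, 0, 0], [1, 0, 0]]     # not yet zero
assert N3 == [[0, 0, 0], [0, 0, 0], [0, 0, 0]]     # nilpotent of index 3
print("J - 2I is nilpotent of index 3")
```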

  • Haha, I only noticed just now: the first matrix is a transposed Jordan block! XD (2010-08-09)