Seven Different Ways to Obtain the Eigenvalues of a Matrix
When asked to compute the eigenvalues of a real $n \times n$ matrix \(A\), students in an introductory linear algebra course are usually taught to find the roots of the characteristic polynomial $\text{det} (A - \lambda I)$. Those roots are the eigenvalues of \(A\).
But there are situations where other methods for computing eigenvalues are more efficient. These methods can also provide a quick way to verify that eigenvalues found by any method are correct.
Here are a few situations where we can determine the eigenvalues of a square matrix without relying on computing roots of the characteristic polynomial.
Eigenvalues of Triangular Matrices
\(A\) is a triangular matrix if all the entries either above or below the main diagonal are zero. The remaining entries can be anything.
We say that a matrix is upper triangular if all of its entries below the main diagonal are zero. The matrices below are upper triangular.
$$\begin{pmatrix} 4 & 3 \\0 & 2 \end{pmatrix}, \quad \begin{pmatrix} 0 & 2 \\0 & 2 \end{pmatrix}$$
If \(A\) is triangular, then its eigenvalues are the entries on its main diagonal.
For example, suppose $$A = \begin{pmatrix} 2 & 0 &1 \\ 0 & 4 & 7 \\ 0 & 0 & 3 \end{pmatrix}$$ This matrix is upper triangular because the elements below the main diagonal are zero. And by inspection, the eigenvalues of this matrix are 2, 3, and 4.
We can verify that these are the eigenvalues by computing the characteristic polynomial, $$ \text{det} \left( A - \lambda I \right) = \text{det} \begin{pmatrix} 2- \lambda & 0 &1 \\ 0 & 4 - \lambda & 7 \\ 0 & 0 & 3 - \lambda \end{pmatrix} = (2 - \lambda ) (4 - \lambda ) (3 - \lambda) = 0$$
Indeed, our eigenvalues are 2, 3, and 4.
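If you want a quick numerical sanity check, NumPy can confirm the result for the example above. This is just one convenient way to verify the claim; it is not part of the hand computation itself.

```python
import numpy as np

# The upper triangular example from above; its eigenvalues should be
# the diagonal entries 2, 4, and 3.
A = np.array([[2, 0, 1],
              [0, 4, 7],
              [0, 0, 3]])

print(np.linalg.eigvals(A))  # [2. 4. 3.] (order may vary)
```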
Eigenvalues of Singular Matrices
If \(A\) is singular, at least one of its eigenvalues must be zero.
Why? Here are two ways of looking at this relationship.
- The eigenvalues of a matrix are the values of \(\lambda\) that make \(A - \lambda I\) singular. If the matrix \(A\) is already singular, then \(A - \lambda I = A - 0I = A\) is singular for \(\lambda = 0\).
- If zero is an eigenvalue, then $$A\vec v = 0\vec v = \vec 0$$ Since an eigenvector \(\vec v\) must be nonzero, \(A\vec v = \vec 0\) has a nonzero solution, which is only possible if the matrix \(A\) is not invertible.
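Here is a small numerical illustration of this fact. The matrix below is my own example (its second column is twice the first, so it is singular), not one from the discussion above.

```python
import numpy as np

# A singular matrix chosen for illustration: the second column is twice the first.
A = np.array([[1, 2],
              [2, 4]])

print(np.linalg.det(A))      # 0.0 (up to floating-point error), so A is singular
print(np.linalg.eigvals(A))  # [0. 5.] -- one eigenvalue is zero, as expected
```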
Eigenvalues of Stochastic Matrices
A stochastic matrix is a square matrix whose columns sum to 1 and whose entries are all between 0 and 1.
If a matrix is stochastic, then (at least) one of the eigenvalues is equal to 1.
Why is this the case? Here is a rough outline.
- The eigenvalues of a square matrix are the roots of the characteristic polynomial, \(\text{det} (A - \lambda I)\).
- The determinant of a matrix is equal to the determinant of its transpose, so \(\text{det} (A - \lambda I) = \text{det} \left( (A - \lambda I)^T \right) = \text{det} (A^T - \lambda I)\). Therefore, a matrix and its transpose have the same eigenvalues (but not necessarily the same eigenvectors).
- If \(A\) is stochastic, its columns sum to 1, so the rows of \(A^T\) sum to 1.
- If \(A\) is an \(n\times n\) stochastic matrix and \(p\) is the vector whose \(n\) entries are all equal to 1, then each entry of \(A^T p\) is a row sum of \(A^T\), so \(A^T p = p = (1) p\).
The last item shows that 1 is an eigenvalue of \(A^T\), and since \(A\) and \(A^T\) share the same eigenvalues, 1 is always an eigenvalue of a stochastic matrix.
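As a quick numerical check, the column-stochastic matrix below (with entries chosen arbitrarily for this illustration) does indeed have 1 among its eigenvalues.

```python
import numpy as np

# A column-stochastic matrix: each column sums to 1 and all entries lie in [0, 1].
A = np.array([[0.7, 0.2],
              [0.3, 0.8]])

print(A.sum(axis=0))         # [1. 1.] -- the columns sum to 1
print(np.linalg.eigvals(A))  # [1.  0.5] -- 1 is an eigenvalue, as expected (order may vary)
```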
Eigenvalues of Rotation-Dilation Matrices
A $2\times2$ rotation-dilation matrix has the form $$\begin{pmatrix} a & -b \\ b & a\end{pmatrix}$$
The eigenvalues of a rotation-dilation matrix are always \(a \pm ib\).
For example, the eigenvalues of this matrix
$$\begin{pmatrix} 1 & -2 \\2 & 1 \end{pmatrix}$$
are $\lambda = 1 \pm 2i$.
This useful result comes directly from the characteristic polynomial:
$$\text{det} \begin{pmatrix} a - \lambda & -b \\ b & a - \lambda \end{pmatrix} = \lambda^2 - 2a\lambda + a^2 + b^2$$
By the quadratic formula, the roots of this polynomial are \(\lambda = a \pm \sqrt{a^2 - (a^2 + b^2)} = a \pm ib\).
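A numerical check of the example above (using NumPy purely for verification) gives the expected complex conjugate pair.

```python
import numpy as np

# Rotation-dilation matrix with a = 1 and b = 2; its eigenvalues should be 1 +/- 2i.
A = np.array([[1, -2],
              [2,  1]])

print(np.linalg.eigvals(A))  # [1.+2.j 1.-2.j]
```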
The Trace of a Matrix is the Sum of the Eigenvalues
The trace of a matrix is the sum of the entries on its main diagonal.
The trace of a matrix is equal to the sum of its eigenvalues.
There are a few different ways to prove this result; the Problems in Mathematics website has a succinct proof.
For example, the trace of
$$A = \begin{pmatrix} 2 & 8 \\ 6 & 24 \end{pmatrix}$$
is \(2 + 24 = 26\). By inspection, the matrix is singular, so one eigenvalue is \(\lambda_1 = 0\). Thus, the other eigenvalue can be found using the trace:
$$\lambda_1 + \lambda_2 = 26 \;\Rightarrow\; \lambda_2 = 26$$
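The same bookkeeping can be checked numerically for the matrix above; NumPy is used here only as a verification tool.

```python
import numpy as np

# The matrix from the trace example above.
A = np.array([[2, 8],
              [6, 24]])

eigenvalues = np.linalg.eigvals(A)
print(np.trace(A))        # 26
print(eigenvalues.sum())  # 26.0 (up to floating-point error) -- the sum of the eigenvalues
```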
A − λI Has to Be Singular
Knowing that \(\lambda\) is a number that makes \(A – \lambda I \) singular can sometimes allow us to pick off eigenvalues very quickly by inspection.
Without any calculation, can you determine the eigenvalues of this matrix by inspection?
$$A = \begin{pmatrix} 2 & 0 & 0 \\ 0 & 2 & 1 \\ 0 & 1 & 2 \end{pmatrix}$$
Knowing that the eigenvalues \(\lambda\) are the numbers that make \(A - \lambda I\) singular, we can see that:
- 1 is an eigenvalue because the last two columns of \(A - (1) I \) are identical, which means that we have a singular matrix
- 2 is an eigenvalue because the first column of \(A - (2) I \) is a column of zeros, which means that we have a singular matrix
- 3 is an eigenvalue because the last two columns of \(A - (3) I \) are multiples of each other, which means that we have a singular matrix
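To double-check the three observations above, a quick NumPy computation (again, only a verification aid) returns exactly these eigenvalues.

```python
import numpy as np

# The matrix inspected above; by inspection its eigenvalues are 1, 2, and 3.
A = np.array([[2, 0, 0],
              [0, 2, 1],
              [0, 1, 2]])

print(np.sort(np.linalg.eigvals(A)))  # [1. 2. 3.]
```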
Here is another example.
$$A = \begin{pmatrix} 4 & 1 & 1 & 1 \\ 1 & 4 & 1 & 1 \\ 1 & 1 & 4 & 1 \\ 1 & 1 & 1 & 4 \end{pmatrix}$$
Subtracting 3 from the main diagonal yields a matrix that is singular (because every column is identical), so 3 is one of the eigenvalues of this matrix.
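A numerical check confirms that 3 is an eigenvalue; in fact it appears with multiplicity three, and the trace (16) then pins down the remaining eigenvalue as 16 − 3·3 = 7.

```python
import numpy as np

# The 4x4 matrix above: 4 on the main diagonal and 1 everywhere else.
A = np.array([[4, 1, 1, 1],
              [1, 4, 1, 1],
              [1, 1, 4, 1],
              [1, 1, 1, 4]])

print(np.sort(np.linalg.eigvals(A)))  # approximately [3. 3. 3. 7.]
```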
There Are Many Ways to Compute Eigenvalues
There are several other ways to determine eigenvalues that do not require the characteristic polynomial. Knowing these properties gives deeper insight into what eigenvalues are and how they behave.