Diagonalizability and Invertibility of a Matrix

Is there a relationship between invertibility of a matrix and whether it can be diagonalized? For example, if a matrix is not invertible, can the matrix still be diagonalized?

Before we explore this relationship, let's define what it means for a matrix to be invertible and what it means for a matrix to be diagonalizable.

Invertibility of a Matrix

Before we go too far, let's define what an invertible matrix is.

Definition
An $n\times n$ real matrix $A$ is invertible (or non-singular) if there is an $n\times n$ matrix $B$ so that $AB = BA = I_n$, where the matrix $I_n$ is the $n\times n$ identity matrix. If there is a matrix $B$ so that $AB = BA = I_n$, then we write $B= A ^{-1}$, and $B$ is the inverse of matrix $A$. If a matrix is not invertible, the matrix is singular.

How Can we Tell Whether a Square Matrix is Invertible?

There are many ways to determine whether a matrix is invertible. Here are a few different approaches that are often taught in an introductory linear algebra course.

  • If every column of the matrix contains a pivot, the matrix is invertible. In other words, if every column has a leading entry in an echelon form of the matrix, then the matrix is invertible. Row reducing the matrix to echelon form makes it easy to see whether every column is pivotal, and therefore whether the matrix is invertible.
  • The determinant of a singular matrix is zero, so a square matrix is invertible if and only if its determinant is nonzero. Computing the determinant and checking whether it is zero is another way to decide invertibility, though it can be time consuming for a large matrix (see the sketch after this list).
  • Matrix $A$ is invertible if there is a sequence of elementary row operations that reduces $A$ to the identity. In other words, we can write $$E_pE_{p-1} \cdots E_2E_1 A = I_n$$ where each $E_i$ is an elementary matrix that applies one row operation. We could use this idea to determine whether a matrix is invertible by forming the augmented matrix $(A \, | \, I_n)$. If we can reduce the augmented matrix to the form $(I_n \, | \, B)$, then $A$ is invertible and its inverse is the matrix $B$.
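To make the determinant test concrete, here is a minimal NumPy sketch. The helper name `is_invertible` and the tolerance are my own choices for illustration, since floating-point determinants of singular matrices often come out near zero rather than exactly zero.

```python
import numpy as np

def is_invertible(A, tol=1e-10):
    """Determinant test for invertibility (illustrative helper).

    The tolerance is an assumption: in floating point, the determinant
    of a singular matrix may be a tiny number rather than exactly zero.
    """
    A = np.asarray(A, dtype=float)
    return A.shape[0] == A.shape[1] and abs(np.linalg.det(A)) > tol

print(is_invertible([[1.0, 2.0], [3.0, 4.0]]))  # True:  det = -2
print(is_invertible([[1.0, 2.0], [2.0, 4.0]]))  # False: det =  0
```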

There are many other ways we can describe invertible matrices and their properties, but we already have all we need to describe the relationship between diagonalizability and invertibility. 

When Can we Diagonalize a Matrix?

We say that a real $n \times n$ matrix $A$ is diagonalizable if we can write $A = PDP^{-1}$, where $D$ is a diagonal matrix and $P$ is an invertible matrix. It can be shown that the entries on the main diagonal of matrix $D$ have to be the eigenvalues of matrix $A$, and the columns of $P$ are their corresponding eigenvectors.

When can we diagonalize a matrix? For us to be able to write $A=PDP^{-1}$, we must be able to build a square $n\times n$ matrix $P$ from the eigenvectors of $A$ in such a way that $P$ is invertible. This means our matrix $A$ needs exactly $n$ linearly independent eigenvectors in order for us to diagonalize it. If we have fewer than $n$ independent eigenvectors, we cannot construct an invertible $P$.
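To see what this means computationally, here is a minimal NumPy sketch (an illustration, not a robust routine; the helper name and tolerance are my own choices). `np.linalg.eig` returns the eigenvectors as the columns of a matrix, and $A$ is diagonalizable exactly when those columns are linearly independent, that is, when that matrix has full rank.

```python
import numpy as np

def is_diagonalizable(A, tol=1e-10):
    """Check for n linearly independent eigenvectors (illustrative helper)."""
    A = np.asarray(A, dtype=float)
    # The columns of P are the eigenvectors that np.linalg.eig computes.
    eigenvalues, P = np.linalg.eig(A)
    # A is diagonalizable exactly when P has full rank, i.e. when its
    # columns give n linearly independent eigenvectors.
    return np.linalg.matrix_rank(P, tol=tol) == A.shape[0]

print(is_diagonalizable([[1.0, 0.0], [0.0, 0.0]]))  # True
print(is_diagonalizable([[0.0, 1.0], [0.0, 0.0]]))  # False
```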

When does this happen? There are several ways of answering this question.

  • Luckily, when the eigenvalues of our matrix are all distinct (no two of them are equal), the matrix is diagonalizable, because eigenvectors corresponding to distinct eigenvalues are linearly independent. In other words, $n$ distinct eigenvalues give us $n$ linearly independent eigenvectors, which is all we need to be able to write $A=PDP^{-1}$.
  • If the eigenvalues of our matrix are not all distinct, we might still be able to diagonalize the matrix, as the sketch after this list shows.
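As a quick illustration of that second bullet (a minimal sketch with the same NumPy approach as above), the matrix $2I$ has the repeated eigenvalue $2$, yet it is certainly diagonalizable, because it is already diagonal and has two independent eigenvectors.

```python
import numpy as np

# 2I has the eigenvalue 2 twice, but it is already diagonal, so it is
# trivially diagonalizable: e1 and e2 are two independent eigenvectors.
A = 2 * np.eye(2)
eigenvalues, P = np.linalg.eig(A)
print(eigenvalues)               # [2. 2.] -- a repeated eigenvalue
print(np.linalg.matrix_rank(P))  # 2 -- still two independent eigenvectors
```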

Eigenvectors from Distinct Eigenvalues are Linearly Independent

If all the eigenvalues of an $n \times n$ matrix $A$ are distinct, then $A$ has $n$ linearly independent eigenvectors, which span $\mathbb R^n$, and so the matrix is diagonalizable. A short proof of this idea is in another post that I wrote.

For example, the matrix $$A = \begin{pmatrix} 1 & 0 \\ 0  & 0 \end{pmatrix}$$ is diagonal, so its eigenvalues are the entries on its main diagonal. If you work out the eigenvalues and their corresponding eigenvectors, you will find the following.

$$\lambda_1 = 1, \quad \vec v_1 = \begin{pmatrix} 1\\ 0  \end{pmatrix}, \qquad \lambda_2 = 0, \quad \vec v_2 = \begin{pmatrix} 0\\ 1  \end{pmatrix}$$

Note how the two eigenvalues are distinct, so the eigenvectors $\vec v_1$ and $\vec v_2$ are linearly independent and $A$ is diagonalizable, even though the eigenvalue $\lambda_2 = 0$ tells us that $A$ is singular.

If a Matrix is Diagonalizable, is it Invertible?

Consider the two matrices below.

$$A = \begin{pmatrix}  1 & 0 \\ 0 & 0 \end{pmatrix}, \quad B = \begin{pmatrix} 0 & 1\\0 & 0 \end{pmatrix}$$

Neither matrix is invertible. Matrix $A$ is diagonalizable because it has distinct eigenvalues. But matrix $B$ is not diagonalizable because its only eigenvalue is $\lambda = 0$, and the dimension of the corresponding eigenspace is only 1. In other words, $B$ does not have two linearly independent eigenvectors, so we cannot build an invertible matrix $P$ and write $B = PDP^{-1}$.

The key point is that if a matrix is not invertible, it might still be diagonalizable (like $A$), or it might not be (like $B$).
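If you want to check this numerically, here is a minimal NumPy sketch using the same full-rank test as before (the tolerance is again my own choice).

```python
import numpy as np

A = np.array([[1.0, 0.0], [0.0, 0.0]])
B = np.array([[0.0, 1.0], [0.0, 0.0]])

for name, M in [("A", A), ("B", B)]:
    eigenvalues, P = np.linalg.eig(M)
    # Both matrices are singular, but only A produces two linearly
    # independent eigenvectors (a full-rank P).
    print(name, eigenvalues, np.linalg.matrix_rank(P, tol=1e-10))
# A [1. 0.] 2  -> diagonalizable
# B [0. 0.] 1  -> not diagonalizable
```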

If a Matrix is Not Diagonalizable, Can it be Invertible?

Consider the matrix below.

$$A = \begin{pmatrix}  1 & 1 \\ 0 & 1 \end{pmatrix}$$

This matrix is not diagonalizable. Its only eigenvalue is $\lambda = 1$, and up to scaling there is only one eigenvector associated with this eigenvalue, which we can show is the vector below.

$$\vec v = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$$

But the matrix $A$ is invertible: its determinant is $1$, and its columns span $\mathbb R^2$.
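A quick numerical check of both claims (the same illustrative approach as the earlier sketches):

```python
import numpy as np

A = np.array([[1.0, 1.0], [0.0, 1.0]])

eigenvalues, P = np.linalg.eig(A)
print(np.linalg.det(A))                     # 1.0 -- nonzero, so A is invertible
print(eigenvalues)                          # [1. 1.] -- one repeated eigenvalue
print(np.linalg.matrix_rank(P, tol=1e-10))  # 1 -- too few independent eigenvectors
```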

So if a matrix is not diagonalizable, it might still be invertible.
