Section 3.12 Diagonalization
Checkpoint 3.12.1.
Compute the following matrix products.
\(\displaystyle \left[ \begin{array}{rrr} 1 \amp 2 \amp 3 \\ 4 \amp 5 \amp 6 \\ 7 \amp 8 \amp 9 \end{array} \right]\left[ \begin{array}{ccc} 10 \amp 0 \amp 0 \\ 0 \amp 100 \amp 0 \\ 0 \amp 0 \amp 1000 \end{array} \right]=\)
\(\displaystyle \left[ \begin{array}{ccc} 10 \amp 0 \amp 0 \\ 0 \amp 100 \amp 0 \\ 0 \amp 0 \amp 1000 \end{array} \right]\left[ \begin{array}{rrr} 1 \amp 2 \amp 3 \\ 4 \amp 5 \amp 6 \\ 7 \amp 8 \amp 9 \end{array} \right]=\)
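One way to check your answers numerically (a sketch using numpy, which is not part of the checkpoint itself): right-multiplying by a diagonal matrix scales the columns of the other factor, while left-multiplying scales its rows.

```python
import numpy as np

# The two matrices from the checkpoint above.
A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])
D = np.diag([10, 100, 1000])

# A @ D scales the COLUMNS of A by 10, 100, 1000 respectively.
print(A @ D)

# D @ A scales the ROWS of A by 10, 100, 1000 respectively.
print(D @ A)
```

Comparing the two printed matrices shows why order matters when one factor is diagonal.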
Checkpoint 3.12.2.
Use the fact that \(\vect{v_1} =(3,0,1)\text{,}\) \(\vect{v_2} =(0,2,1)\text{,}\) and \(\vect{v_3} =(1,1,1)\) are eigenvectors of the matrix \(\mtx{A} = \left[ \begin{array}{rrr} -2 \amp -3 \amp 6 \\ 2 \amp 5 \amp -6 \\ 0 \amp 1 \amp 0 \end{array} \right]\) corresponding to the eigenvalues \(\lambda_1 =0\text{,}\) \(\lambda_2=2\text{,}\) and \(\lambda_3=1\) to fill in the blanks below.
\(\left[ \begin{array}{rrr} -2 \amp -3 \amp 6 \\ 2 \amp 5 \amp -6 \\ 0 \amp 1 \amp 0 \end{array} \right]\left[ \begin{array}{ccc} \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \\ \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \\ \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \end{array} \right] = \left[ \begin{array}{ccc} \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \\ \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \\ \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \end{array} \right]\left[ \begin{array}{ccc} \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \\ \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \\ \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \end{array} \right]\)
and
\(\left[ \begin{array}{rrr} -2 \amp -3 \amp 6 \\ 2 \amp 5 \amp -6 \\ 0 \amp 1 \amp 0 \end{array} \right] = \left[ \begin{array}{ccc} \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \\ \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \\ \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \end{array} \right] \left[ \begin{array}{ccc} \underline{\hspace{.2in} } \amp 0 \amp 0 \\ 0 \amp \underline{\hspace{.2in} } \amp 0 \\ 0 \amp 0 \amp \underline{\hspace{.2in} } \end{array} \right]\left[ \begin{array}{ccc} \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \\ \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \\ \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \amp \underline{\hspace{.2in} } \end{array} \right]\)
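After filling in the blanks, you can verify your matrices numerically. The sketch below (using numpy; an aid, not part of the checkpoint) places the given eigenvectors as the columns of a matrix \(P\) and the matching eigenvalues on the diagonal of \(D\), then checks both identities.

```python
import numpy as np

A = np.array([[-2, -3, 6],
              [2, 5, -6],
              [0, 1, 0]])

# Columns of P are the given eigenvectors v1, v2, v3, in that order.
P = np.array([[3, 0, 1],
              [0, 2, 1],
              [1, 1, 1]])

# Diagonal of D holds the matching eigenvalues, in the same order.
D = np.diag([0, 2, 1])

print(np.array_equal(A @ P, P @ D))              # first identity: AP = PD
print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # second identity: A = P D P^{-1}
```

The eigenvalue/eigenvector order must agree between \(P\) and \(D\): swapping two columns of \(P\) requires swapping the corresponding diagonal entries of \(D\).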
Checkpoint 3.12.3.
Given \(\mtx{A} = \left[ \begin{array}{rr} 1 \amp -1 \\ 2 \amp 4 \end{array} \right]\text{,}\) find an invertible matrix \(\mtx{P}\) and a diagonal matrix \(\mtx{D}\) such that \(\mtx{A} = \mtx{P}\mtx{D}\mtx{P}^{-1}\text{.}\)
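A hand computation is the point of this checkpoint, but you can confirm your \(\mtx{P}\) and \(\mtx{D}\) afterward. As a sketch, `np.linalg.eig` returns the eigenvalues and a matrix whose columns are (unit-length) eigenvectors; your eigenvectors may differ from numpy's by a scalar multiple, which is still correct.

```python
import numpy as np

A = np.array([[1.0, -1.0],
              [2.0, 4.0]])

# eigvals holds the eigenvalues; the columns of P are eigenvectors.
eigvals, P = np.linalg.eig(A)
D = np.diag(eigvals)

# Reassembling P D P^{-1} should recover A, up to floating-point error.
print(np.allclose(P @ D @ np.linalg.inv(P), A))
```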
Theorem 3.12.4.
If an \(n\times n\) matrix \(\mtx{A}\) has \(n\) linearly independent eigenvectors, then there exist an invertible matrix \(\mtx{P}\) and a diagonal matrix \(\mtx{D}\) such that \(\mtx{A} = \mtx{P}\mtx{D}\mtx{P}^{-1}\text{,}\) where the columns of \(\mtx{P}\) are the eigenvectors and the diagonal entries of \(\mtx{D}\) are the corresponding eigenvalues.
Challenge 3.12.5.
Show that if an \(n\times n\) matrix \(\mtx{A}\) has \(n\) linearly independent eigenvectors, then \(\det(\mtx{A})\) is the product of the \(n\) corresponding eigenvalues.
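Before attempting a proof, it can help to see the claim hold numerically. The sketch below checks it for the matrix from Checkpoint 3.12.3 (a spot check, not a proof).

```python
import numpy as np

# The matrix from Checkpoint 3.12.3, with eigenvalues 2 and 3.
A = np.array([[1.0, -1.0],
              [2.0, 4.0]])
eigvals, _ = np.linalg.eig(A)

# det(A) = (1)(4) - (-1)(2) = 6, and the eigenvalue product is 2 * 3 = 6.
print(np.isclose(np.prod(eigvals), np.linalg.det(A)))
```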
Discussion Question 3.12.6.
How would Theorem 3.12.4 help us efficiently compute \(\mtx{A}^{100}\text{?}\)
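The idea behind this question can be sketched numerically: if \(\mtx{A} = \mtx{P}\mtx{D}\mtx{P}^{-1}\text{,}\) then \(\mtx{A}^{100} = (\mtx{P}\mtx{D}\mtx{P}^{-1})(\mtx{P}\mtx{D}\mtx{P}^{-1})\cdots(\mtx{P}\mtx{D}\mtx{P}^{-1}) = \mtx{P}\mtx{D}^{100}\mtx{P}^{-1}\) because the inner \(\mtx{P}^{-1}\mtx{P}\) factors cancel, and \(\mtx{D}^{100}\) only requires raising each diagonal entry to the 100th power. The matrix below is the one from Checkpoint 3.12.3.

```python
import numpy as np

A = np.array([[1.0, -1.0],
              [2.0, 4.0]])
eigvals, P = np.linalg.eig(A)

# D^100 is cheap: raise each diagonal entry to the 100th power.
D100 = np.diag(eigvals ** 100)

# A^100 = P D^100 P^{-1}: the inner P^{-1}P factors cancel.
A100 = P @ D100 @ np.linalg.inv(P)

# Agrees (up to floating-point error) with repeated multiplication.
print(np.allclose(A100, np.linalg.matrix_power(A, 100)))
```

Diagonalization turns 99 matrix multiplications into two, plus entrywise powers of the eigenvalues.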