Section 3.11 When Eigenvalues are Distinct
Checkpoint 3.11.1.
Compute the eigenvalues, and a representative eigenvector for each eigenvalue, for \(\mtx{A} = \left[ \begin{array}{rr} 1 \amp 2 \\ 3 \amp -4 \end{array} \right]\) and for \(2\mtx{A} = \left[ \begin{array}{rr} 2 \amp 4 \\ 6 \amp -8 \end{array} \right]\text{.}\) What do you notice?
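One way to begin, sketched here as a reminder of the setup rather than as a solution: the eigenvalues of \(\mtx{A}\) are the roots of \(\det(\mtx{A} - \lambda\mtx{I}) = 0\text{,}\) where \(\mtx{I}\) is the identity matrix, and for this matrix \(\det(\mtx{A} - \lambda\mtx{I}) = \det\left[ \begin{array}{rr} 1-\lambda \amp 2 \\ 3 \amp -4-\lambda \end{array} \right] = (1-\lambda)(-4-\lambda) - (2)(3)\text{.}\) The same setup, with every entry doubled, applies to \(2\mtx{A}\text{.}\)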
Discussion Question 3.11.2.
For a given eigenvalue, can there be just one corresponding eigenvector? Books usually talk about eigenvectors as if there were just one; why doesn't this cause problems?
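One small computation that may be worth trying before the discussion: if \(\mtx{A}\vect{v} = \lambda\vect{v}\) and \(c\) is a nonzero scalar, expand \(\mtx{A}(c\vect{v})\) and compare the result with \(\lambda(c\vect{v})\text{.}\)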
Checkpoint 3.11.3.
Compute the eigenvalues and eigenvectors for \(\mtx{A} = \left[ \begin{array}{rr} 1 \amp 0 \\ 2 \amp 3 \end{array} \right]\text{.}\)
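As a reminder of the second step, not a full solution: once an eigenvalue \(\lambda\) is known, a corresponding eigenvector is any nonzero solution of \((\mtx{A} - \lambda\mtx{I})\vect{v} = \vect{0}\text{,}\) which here means solving \(\left[ \begin{array}{rr} 1-\lambda \amp 0 \\ 2 \amp 3-\lambda \end{array} \right]\vect{v} = \vect{0}\) for each eigenvalue in turn.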
Checkpoint 3.11.4.
Compute the eigenvalues and eigenvectors for \(\mtx{A} = \left[ \begin{array}{rrr} 1 \amp 2 \amp 3 \\ 0 \amp 4 \amp 5 \\ 0 \amp 0 \amp 6 \end{array} \right]\text{.}\)
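A possible shortcut, one of several valid routes: because this matrix is upper triangular, expanding \(\det(\mtx{A} - \lambda\mtx{I})\) along the first column gives \((1-\lambda)\det\left[ \begin{array}{rr} 4-\lambda \amp 5 \\ 0 \amp 6-\lambda \end{array} \right]\text{,}\) so the characteristic polynomial factors as a product of terms coming from the diagonal entries.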
Theorem 3.11.5.
Suppose \(\mtx{A}\) is a square matrix with eigenvectors \(\vect{v}\) and \(\vect{w}\text{,}\) and corresponding eigenvalues \(\lambda_\vect{v}\) and \(\lambda_\vect{w}\text{.}\) If \(\lambda_\vect{v} \neq \lambda_\vect{w}\) then \(\vect{v}\) and \(\vect{w}\) are linearly independent.
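A sketch of one standard argument, with the details left to check: suppose \(c_1\vect{v} + c_2\vect{w} = \vect{0}\text{.}\) Applying \(\mtx{A}\) gives \(c_1\lambda_\vect{v}\vect{v} + c_2\lambda_\vect{w}\vect{w} = \vect{0}\text{,}\) while multiplying the original relation by \(\lambda_\vect{w}\) gives \(c_1\lambda_\vect{w}\vect{v} + c_2\lambda_\vect{w}\vect{w} = \vect{0}\text{.}\) Subtracting, \(c_1(\lambda_\vect{v} - \lambda_\vect{w})\vect{v} = \vect{0}\text{;}\) since \(\vect{v} \neq \vect{0}\) and \(\lambda_\vect{v} \neq \lambda_\vect{w}\text{,}\) this forces \(c_1 = 0\text{,}\) and then \(c_2 = 0\) because \(\vect{w} \neq \vect{0}\text{.}\)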
Challenge 3.11.6.
Given a square matrix \(\mtx{A}\) with three eigenvectors \(\vect{u}\text{,}\) \(\vect{v}\) and \(\vect{w}\text{,}\) having distinct corresponding eigenvalues \(\lambda_\vect{u}\text{,}\) \(\lambda_\vect{v}\) and \(\lambda_\vect{w}\text{,}\) show that \(\vect{u}\text{,}\) \(\vect{v}\) and \(\vect{w}\) are linearly independent.
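One possible line of attack, offered only as a hint: suppose \(c_1\vect{u} + c_2\vect{v} + c_3\vect{w} = \vect{0}\text{,}\) apply \(\mtx{A}\text{,}\) and combine the two relations to eliminate one of the vectors, reducing the question to the two-vector situation handled in Theorem 3.11.5.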