Definition
Let
- $A$ : $n \times n$ square matrix
- $\mathbf{x}$ : nonzero vector in $\mathbb{R}^n$
If $A\mathbf{x} = \lambda\mathbf{x}$ for some scalar $\lambda$,
Then
- $\lambda$ is called an eigenvalue of $A$
- $\mathbf{x}$ is called an eigenvector corresponding to $\lambda$
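The definition above can be checked numerically. A minimal sketch in Python, where the matrix `A`, the candidate eigenvector `x`, and the candidate eigenvalue `lam` are assumed examples of my own choosing, not values from this note:

```python
def matvec(A, x):
    """Multiply a 2x2 matrix A by a length-2 vector x."""
    return [A[0][0] * x[0] + A[0][1] * x[1],
            A[1][0] * x[0] + A[1][1] * x[1]]

A = [[2, 1], [1, 2]]   # assumed example matrix (not from the note)
x = [1, 1]             # candidate eigenvector
lam = 3                # candidate eigenvalue

Ax = matvec(A, x)          # left-hand side:  A x
lam_x = [lam * xi for xi in x]  # right-hand side: lambda x
print(Ax == lam_x)  # True: x is an eigenvector of A with eigenvalue 3
```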
Procedure: Finding eigenvalue
Let $A$ be the matrix of interest.
Based on Proof 1, the procedure is simply to:
- Solve for $\lambda$: $\det(A - \lambda I) = 0$
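For a $2 \times 2$ matrix, $\det(A - \lambda I) = 0$ expands to the quadratic $\lambda^2 - \operatorname{tr}(A)\,\lambda + \det(A) = 0$, which can be solved in closed form. A sketch, assuming real eigenvalues and an example matrix of my own choosing:

```python
import math

def eigenvalues_2x2(A):
    """Solve det(A - lambda*I) = 0 for a 2x2 matrix A.

    The characteristic polynomial is lambda^2 - tr(A)*lambda + det(A) = 0,
    so the quadratic formula gives both roots directly.
    """
    tr = A[0][0] + A[1][1]                      # trace of A
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]  # determinant of A
    disc = tr * tr - 4 * det                    # discriminant; assumed >= 0
    root = math.sqrt(disc)
    return [(tr + root) / 2, (tr - root) / 2]   # larger root first

print(eigenvalues_2x2([[2, 1], [1, 2]]))  # [3.0, 1.0]
```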
Example: Finding eigenvalue
Let $A$ be the matrix of interest.
The eigenvalues of $A$ are the solutions of $\det(A - \lambda I) = 0$.
The last equation above is true when $\lambda = \lambda_1$ and $\lambda = \lambda_2$. Thus, $\lambda_1$ and $\lambda_2$ are the eigenvalues of $A$.
Note
The order of eigenvalues doesn't matter, but in some cases, such as the Singular Value Decomposition, the singular values are sorted in descending order along the diagonal of $\Sigma$, with the columns of $U$ and $V$ ordered to match.
Procedure: Finding eigenvector
Let $A$ be the matrix of interest.
- Find the eigenvalue(s) $\lambda$ of $A$
- Solve for $\mathbf{x}$: $(A - \lambda I)\mathbf{x} = \mathbf{0}$ for each eigenvalue $\lambda$
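The second step above can be sketched for the $2 \times 2$ case. The helper below assumes the eigenvalue $\lambda$ is already known and returns one nonzero solution of $(A - \lambda I)\mathbf{x} = \mathbf{0}$; the example matrix is an assumption, not the one from this note:

```python
def eigenvector_2x2(A, lam):
    """Solve (A - lam*I) x = 0 for a 2x2 matrix A and known eigenvalue lam.

    Returns one nonzero solution; any nonzero scalar multiple is also valid.
    """
    a, b = A[0]
    c, d = A[1]
    # The first row of (A - lam*I) is (a - lam, b).  The vector (b, lam - a)
    # is orthogonal to it, and -- because lam is an eigenvalue -- to the
    # second row as well, so it solves the whole system.
    v = [b, lam - a]
    if v == [0, 0]:      # first row of (A - lam*I) was zero; use the second
        v = [lam - d, c]
    if v == [0, 0]:      # A - lam*I = 0: every nonzero vector is a solution
        v = [1, 0]
    return v

print(eigenvector_2x2([[2, 1], [1, 2]], 3))  # [1, 1]
```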
Example: Finding eigenvector
For $\lambda = \lambda_1$:
Note
Since $\det(A - \lambda_1 I) = 0$, we have infinitely many solutions. Then
$$\begin{align} \mathbf{x} & = \begin{bmatrix} x_{1} \\ x_{2} \end{bmatrix} \\ & = \begin{bmatrix} x_{1} \\ x_{1} \end{bmatrix} \\ & = x_1\begin{bmatrix} 1 \\ 1 \end{bmatrix} \end{align}$$
Setting $x_1 = 1$ gives the eigenvector $\mathbf{x} = \begin{bmatrix} 1 \\ 1 \end{bmatrix}$. Any other $x_1 \neq 0$ is also valid, since an eigenvector is determined only up to a nonzero scalar multiple.
For $\lambda = \lambda_2$:
Proof 1
Rewriting $A\mathbf{x} = \lambda\mathbf{x}$ as $(A - \lambda I)\mathbf{x} = \mathbf{0}$, notice that this system always has the trivial solution $\mathbf{x} = \mathbf{0}$, but $\mathbf{x}$ is, by definition, a nonzero vector.
Based on (a) and (g) of the matrix equivalency statements, the negation of the biconditional states that "$(A - \lambda I)\mathbf{x} = \mathbf{0}$ has nontrivial solutions if and only if $A - \lambda I$ is not invertible".
Therefore, for nontrivial solutions for $\mathbf{x}$ to exist, the determinant of $A - \lambda I$ must be $0$, i.e.,
$$\det(A - \lambda I) = 0$$
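This criterion is easy to check numerically: $\det(A - \lambda I)$ vanishes exactly at the eigenvalues. A small sketch with an assumed $2 \times 2$ example matrix:

```python
def det_shift(A, lam):
    """Compute det(A - lam*I) for a 2x2 matrix A."""
    return (A[0][0] - lam) * (A[1][1] - lam) - A[0][1] * A[1][0]

A = [[2, 1], [1, 2]]   # assumed example matrix (not from the note)
print(det_shift(A, 3))  # 0  -> 3 is an eigenvalue of A
print(det_shift(A, 2))  # -1 -> 2 is not an eigenvalue of A
```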