Eigenvalues and Eigenvectors: A Comprehensive Guide

Eigenvalues and eigenvectors are fundamental concepts in linear algebra, unlocking insights into matrix transformations and their applications across mathematics, physics, and engineering. They describe how matrices scale and rotate vectors, revealing intrinsic properties of linear systems. This MathMultiverse guide explores their definitions, computation methods, examples, visualizations, and real-world applications.

Definitions and Concepts

For a square matrix \( A \), an eigenvector \( v \) is a non-zero vector that, when transformed by \( A \), is scaled by a scalar \( \lambda \), known as the eigenvalue. Mathematically:

\[ Av = \lambda v \]

This implies that the direction of \( v \) remains unchanged (or is reversed if \( \lambda < 0 \)), and only its magnitude is scaled by \( \lambda \). To find eigenvalues, rewrite the equation:

\[ (A - \lambda I)v = 0 \]

For non-trivial solutions (\( v \neq 0 \)), the matrix \( A - \lambda I \) must be singular, so:

\[ \det(A - \lambda I) = 0 \]

This is the characteristic equation, a polynomial in \( \lambda \). Its roots are the eigenvalues. For each eigenvalue, solving \( (A - \lambda I)v = 0 \) yields the corresponding eigenvectors. Eigenvalues may be real or complex. Eigenvectors are determined only up to a nonzero scalar multiple, so they are typically normalized to unit length or scaled to small integer components for readability.
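Both characterizations can be checked numerically. The sketch below (using an illustrative matrix not from the text) verifies with numpy that each eigenvalue makes \( A - \lambda I \) singular and that each eigenvector satisfies \( Av = \lambda v \):

```python
import numpy as np

# Illustrative 2x2 matrix (not from the text); its eigenvalues are 5 and 2.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, eigvecs = np.linalg.eig(A)

# Each eigenvalue makes A - lambda*I singular: det(A - lambda*I) = 0.
for lam in eigvals:
    assert abs(np.linalg.det(A - lam * np.eye(2))) < 1e-9

# Each column of eigvecs satisfies the defining equation A v = lambda v.
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)
```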

Computation Methods

Computing eigenvalues and eigenvectors involves these steps:

  1. Form the characteristic polynomial: Compute \( \det(A - \lambda I) \). For a 2x2 matrix \( A = \begin{bmatrix} a & b \\ c & d \end{bmatrix} \):
     \[ A - \lambda I = \begin{bmatrix} a - \lambda & b \\ c & d - \lambda \end{bmatrix} \]
     \[ \det(A - \lambda I) = (a - \lambda)(d - \lambda) - bc = \lambda^2 - (a + d)\lambda + (ad - bc) = 0 \]
  2. Solve for eigenvalues: Find the roots of the characteristic polynomial. For the quadratic above, use the quadratic formula:
     \[ \lambda = \frac{(a + d) \pm \sqrt{(a + d)^2 - 4(ad - bc)}}{2} \]
  3. Find eigenvectors: For each \( \lambda \), solve \( (A - \lambda I)v = 0 \). This is a homogeneous linear system whose non-trivial solutions are the eigenvectors.
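The steps above can be sketched directly for the real 2x2 case. This is a minimal illustration, not library code; `eig2x2` and `eigvec2x2` are hypothetical helper names:

```python
import math

def eig2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] via the quadratic formula (real case)."""
    tr, det = a + d, a * d - b * c          # trace and determinant
    disc = tr * tr - 4 * det                # discriminant of the quadratic
    if disc < 0:
        raise ValueError("complex eigenvalues; handle separately")
    root = math.sqrt(disc)
    return (tr + root) / 2, (tr - root) / 2

def eigvec2x2(a, b, c, d, lam):
    """An eigenvector for eigenvalue lam: any non-trivial solution of (A - lam I)v = 0."""
    # From the first row (a - lam)v1 + b*v2 = 0: if b != 0, v = (b, lam - a) works.
    if b != 0:
        return (b, lam - a)
    if c != 0:
        return (lam - d, c)
    # Diagonal matrix: standard basis vectors are eigenvectors.
    return (1.0, 0.0) if abs(a - lam) < 1e-12 else (0.0, 1.0)
```

For the worked example below, `eig2x2(2, 1, 1, 2)` returns the pair (3.0, 1.0).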

For larger matrices (3x3, 4x4, and beyond), the characteristic polynomial has higher degree and its roots rarely have a convenient closed form, so numerical methods (e.g., the QR algorithm) or software libraries are used in practice. Eigenvectors with rational components are often rescaled to small integers, e.g., by clearing denominators after Gaussian elimination.
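In practice, numpy's LAPACK-backed solvers handle the numerical work. A short sketch for the symmetric case, where `eigh` guarantees real eigenvalues and orthonormal eigenvectors, also verifies the spectral decomposition \( A = V \Lambda V^T \):

```python
import numpy as np

rng = np.random.default_rng(0)
S = rng.standard_normal((4, 4))
A = (S + S.T) / 2                # symmetrize, so eigenvalues are real

# eigh is the solver specialized for symmetric/Hermitian matrices.
vals, vecs = np.linalg.eigh(A)

# Spectral decomposition: A = V diag(vals) V^T reconstructs the matrix.
assert np.allclose(vecs @ np.diag(vals) @ vecs.T, A)
```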

Examples

2x2 Matrix

Consider \( A = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix} \):

\[ A - \lambda I = \begin{bmatrix} 2 - \lambda & 1 \\ 1 & 2 - \lambda \end{bmatrix} \]
\[ \det(A - \lambda I) = (2 - \lambda)^2 - 1 = \lambda^2 - 4\lambda + 3 = 0 \]
\[ \lambda = \frac{4 \pm \sqrt{16 - 12}}{2} = \frac{4 \pm 2}{2} \implies \lambda_1 = 3, \quad \lambda_2 = 1 \]

For \( \lambda = 3 \):

\[ \begin{bmatrix} -1 & 1 \\ 1 & -1 \end{bmatrix} \begin{bmatrix} v_1 \\ v_2 \end{bmatrix} = 0 \]
\[ v_1 = v_2 \implies v = \begin{bmatrix} 1 \\ 1 \end{bmatrix} \]

For \( \lambda = 1 \):

\[ \begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix} \begin{bmatrix} v_1 \\ v_2 \end{bmatrix} = 0 \]
\[ v_1 = -v_2 \implies v = \begin{bmatrix} 1 \\ -1 \end{bmatrix} \]
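The two eigenpairs just derived can be checked against the defining equation \( Av = \lambda v \):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigenpairs from the worked example: (3, [1, 1]) and (1, [1, -1]).
pairs = [(3.0, np.array([1.0, 1.0])),
         (1.0, np.array([1.0, -1.0]))]

for lam, v in pairs:
    assert np.allclose(A @ v, lam * v)
```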

3x3 Matrix

For \( A = \begin{bmatrix} 1 & 2 & 0 \\ 0 & 3 & 0 \\ 0 & 0 & 5 \end{bmatrix} \), a triangular matrix, eigenvalues are the diagonal entries: \( \lambda = 1, 3, 5 \). Eigenvectors are found by solving \( (A - \lambda I)v = 0 \).
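A quick numerical check confirms that the eigenvalues of this triangular matrix are exactly its diagonal entries:

```python
import numpy as np

# Upper-triangular matrix from the example: eigenvalues = diagonal entries.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 5.0]])

vals = np.linalg.eigvals(A)
assert np.allclose(np.sort(vals.real), [1.0, 3.0, 5.0])
```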

Complex Eigenvalues

For a rotation matrix \( A = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix} \):

\[ \det(A - \lambda I) = \lambda^2 + 1 = 0 \implies \lambda = \pm i \]

Eigenvectors involve complex components, reflecting rotational behavior.
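numpy returns these complex eigenvalues automatically, even for a real input matrix:

```python
import numpy as np

# 90-degree rotation matrix: no real direction is preserved.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])

vals, vecs = np.linalg.eig(R)

# The eigenvalues are the purely imaginary pair +i and -i.
assert np.allclose(sorted(vals, key=lambda z: z.imag), [-1j, 1j])
```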

Visualization

Matrix Transformation

Figure: transformation of unit vectors by \( A = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix} \). The eigenvectors \( \begin{bmatrix} 1 \\ 1 \end{bmatrix} \) and \( \begin{bmatrix} 1 \\ -1 \end{bmatrix} \) keep their directions and are stretched by factors of 3 and 1, respectively, while all other vectors change direction.

Applications

  • Physics: In quantum mechanics, eigenvalues of the Hamiltonian represent energy levels, and eigenvectors are the corresponding stationary states:
    \[ H|\psi\rangle = E|\psi\rangle \]
  • Engineering: In structural analysis, eigenvalues give the natural frequencies of vibration, and eigenvectors describe the corresponding mode shapes:
    \[ (K - \omega^2 M)u = 0 \]
  • Computer Science: Google’s PageRank algorithm ranks pages using the dominant eigenvector \( r \) of the web’s link matrix \( P \):
    \[ P r = \lambda r, \quad \lambda = 1 \]
  • Data Analysis: Principal Component Analysis (PCA) uses the eigenvalues and eigenvectors of the data’s covariance matrix to reduce dimensionality, identifying the principal directions of variance.
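The PCA application can be sketched end-to-end with an eigendecomposition of the sample covariance matrix (synthetic data for illustration; the variable names are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic correlated 2-D data: most variance lies along one direction.
X = rng.standard_normal((200, 2)) @ np.array([[3.0, 0.0],
                                              [1.0, 0.5]])

Xc = X - X.mean(axis=0)                   # center the data
cov = Xc.T @ Xc / (len(Xc) - 1)           # sample covariance matrix

vals, vecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
principal = vecs[:, np.argmax(vals)]      # first principal component
explained = vals.max() / vals.sum()       # fraction of variance it explains
```

For this data, the largest eigenvalue accounts for the bulk of the total variance, which is exactly why projecting onto `principal` loses little information.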