Eigenvalue & Eigenvector Calculator

Compute eigenvalues and eigenvectors for 2×2 and 3×3 matrices with characteristic polynomial, diagonalization check, spectrum visualization, and detailed calculation steps.

About the Eigenvalue & Eigenvector Calculator

Eigenvalues and eigenvectors are among the most important concepts in linear algebra, with applications spanning physics, engineering, data science, quantum mechanics, and machine learning. An eigenvalue λ and its corresponding eigenvector v satisfy Av = λv, meaning the matrix A acts on v by simply scaling it.

This calculator finds all eigenvalues and eigenvectors for 2×2 and 3×3 real matrices. It computes the characteristic polynomial det(A − λI) = 0, solves for the eigenvalues (roots of this polynomial), and then finds the corresponding eigenvectors by solving (A − λI)v = 0 for each eigenvalue.
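This pipeline — characteristic polynomial, roots, then null spaces — can be sketched with NumPy. The matrix below is a hypothetical input, and `np.poly`, `np.roots`, and an SVD null-space step stand in for the calculator's internals, which are not specified here:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])            # hypothetical 2x2 input

coeffs = np.poly(A)                   # monic characteristic polynomial: [1, -4, 3]
eigenvalues = np.roots(coeffs)        # its roots are the eigenvalues: 3 and 1

# An eigenvector spans the null space of (A - lambda*I); the last
# right-singular vector from the SVD gives that null-space direction.
for lam in eigenvalues:
    _, _, Vt = np.linalg.svd(A - lam * np.eye(2))
    v = Vt[-1]
    assert np.allclose(A @ v, lam * v)   # verifies Av = lambda v
```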

For 2×2 matrices, the characteristic polynomial is a quadratic with a closed-form solution. For 3×3 matrices, it becomes a cubic, and the calculator uses Cardano's formula or numerical methods to find the roots. Complex eigenvalues are reported when they arise (e.g., rotation matrices).

The tool also checks whether the matrix is diagonalizable (has n linearly independent eigenvectors), computes the trace (sum of eigenvalues) and determinant (product of eigenvalues) as verification, and displays a spectrum visualization showing eigenvalue locations on a number line. This makes it easy to assess stability (all eigenvalues with negative real part), oscillatory behavior (complex eigenvalues), and conditioning of the matrix.
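The trace and determinant checks mentioned above follow directly from the spectrum; here is a minimal confirmation on a made-up upper-triangular 3×3 matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 2.0]])       # hypothetical upper-triangular 3x3

eigenvalues = np.linalg.eigvals(A)    # for a triangular matrix: the diagonal entries

# Trace is the sum of eigenvalues; determinant is their product
assert np.isclose(np.trace(A), eigenvalues.sum())
assert np.isclose(np.linalg.det(A), eigenvalues.prod())
```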

Why Use This Eigenvalue & Eigenvector Calculator?

Finding eigenvalues requires solving the characteristic polynomial — a quadratic for 2×2 and a cubic for 3×3 — then back-substituting each root into (A − λI)v = 0 to find eigenvectors. The cubic case is especially tricky, and complex eigenvalues demand careful handling. This calculator computes the characteristic polynomial, all eigenvalues (real or complex), the corresponding eigenvectors, and checks diagonalizability, with a spectrum visualization. It is indispensable for students studying spectral theory, engineers analyzing system stability, and data scientists performing PCA.

How to Use This Calculator

  1. Select the matrix size (2×2 or 3×3)
  2. Enter the matrix elements or choose a preset
  3. View the characteristic polynomial and eigenvalues
  4. Examine each eigenvector and verify Av = λv
  5. Check diagonalization status and the eigenvalue spectrum
  6. Study the relationship between trace, determinant, and eigenvalues

Formula

det(A − λI) = 0 gives eigenvalues λ. For each λ, solve (A − λI)v = 0 for eigenvector v.
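For the 2×2 case this reduces to the quadratic formula on λ² − tr(A)λ + det(A). A minimal implementation (the function name `eig2x2` is ours, not the calculator's):

```python
import math

def eig2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] via the quadratic formula."""
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4 * det          # discriminant of the characteristic polynomial
    if disc >= 0:
        r = math.sqrt(disc)
        return ((tr + r) / 2, (tr - r) / 2)
    r = math.sqrt(-disc)              # negative discriminant: complex conjugate pair
    return (complex(tr / 2, r / 2), complex(tr / 2, -r / 2))
```

For example, `eig2x2(4, 1, 2, 3)` returns `(5.0, 2.0)`, and the 90° rotation matrix `eig2x2(0, -1, 1, 0)` returns the conjugate pair `(1j, -1j)`.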

Example Calculation

For A = [[4, 1], [2, 3]], the characteristic polynomial is det(A − λI) = λ² − 7λ + 10 = (λ−5)(λ−2) = 0, giving eigenvalues λ₁ = 5 and λ₂ = 2. For λ = 5, solving (A − 5I)v = 0 gives v₁ = [1, 1]; for λ = 2, solving (A − 2I)v = 0 gives v₂ = [−1, 2].

Result: λ₁ = 5, v₁ = [1, 1]; λ₂ = 2, v₂ = [−1, 2]
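The stated eigenpairs determine the underlying matrix uniquely via A = PDP⁻¹, so the whole example can be reconstructed and checked in NumPy:

```python
import numpy as np

# Eigenvectors as columns of P, eigenvalues on the diagonal of D
P = np.array([[1.0, -1.0],
              [1.0,  2.0]])
D = np.diag([5.0, 2.0])
A = P @ D @ np.linalg.inv(P)          # reconstructs [[4, 1], [2, 3]]

# Check both eigenpairs and the characteristic polynomial coefficients
assert np.allclose(A @ [1, 1], 5 * np.array([1, 1]))
assert np.allclose(A @ [-1, 2], 2 * np.array([-1, 2]))
assert np.allclose(np.poly(A), [1, -7, 10])   # lambda^2 - 7*lambda + 10
```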

Tips & Best Practices

The Characteristic Polynomial

Eigenvalues are the roots of the **characteristic polynomial** det(A − λI) = 0. For a 2×2 matrix [[a,b],[c,d]], this is λ² − (a+d)λ + (ad−bc) = λ² − tr(A)λ + det(A). For 3×3 matrices, it becomes a cubic, solved via Cardano's formula or the trigonometric method. The coefficients encode fundamental matrix properties: writing the polynomial in monic form det(λI − A) = 0 (which has the same roots), the coefficient of λ^(n−1) is −tr(A) and the constant term is (−1)^n det(A). The characteristic polynomial thus turns an eigenvalue problem into a root-finding problem.
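These coefficient identities are easy to confirm numerically with NumPy's `np.poly`, which returns the monic characteristic polynomial; the symmetric 3×3 matrix below is a made-up example:

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])   # hypothetical symmetric 3x3

coeffs = np.poly(A)   # monic det(lambda*I - A): [1, c2, c1, c0]

# Coefficient of lambda^(n-1) is -tr(A); constant term is (-1)^n det(A)
assert np.isclose(coeffs[1], -np.trace(A))
assert np.isclose(coeffs[-1], (-1) ** 3 * np.linalg.det(A))
```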

Eigenvectors and Diagonalization

For each eigenvalue λ, the **eigenvector** is found by solving the homogeneous system (A − λI)v = 0. The set of all eigenvectors for a given λ forms the **eigenspace**, whose dimension is the geometric multiplicity. A matrix is **diagonalizable** if it has n linearly independent eigenvectors, in which case A = PDP⁻¹ where D is diagonal (eigenvalues) and P contains the eigenvectors as columns. Symmetric matrices are always diagonalizable with real eigenvalues and orthogonal eigenvectors.
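A quick diagonalization check in NumPy, using a hypothetical symmetric matrix (so real eigenvalues and orthogonal eigenvectors are guaranteed):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])            # symmetric, hence diagonalizable

eigenvalues, P = np.linalg.eig(A)     # columns of P are unit eigenvectors
D = np.diag(eigenvalues)

# Diagonalizable: P is invertible and A = P D P^{-1}
assert np.linalg.matrix_rank(P) == A.shape[0]
assert np.allclose(A, P @ D @ np.linalg.inv(P))

# Symmetric matrices give orthonormal eigenvectors: P^T P = I
assert np.allclose(P.T @ P, np.eye(2))
```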

Applications: Stability, PCA, and Quantum Mechanics

In **dynamical systems** (dx/dt = Ax), eigenvalues determine stability: negative real parts mean decay (stable), positive mean growth (unstable), and imaginary parts mean oscillation. In **data science**, PCA uses the eigenvectors of the covariance matrix as principal components and eigenvalues as variance explained. In **quantum mechanics**, observable quantities correspond to eigenvalues of Hermitian operators, and measurement collapses states to eigenvectors. Google's PageRank algorithm finds the dominant eigenvector of the web link matrix. Eigenanalysis is one of the most widely applied concepts in all of applied mathematics.
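The stability reading can be sketched as a small helper (the function name `classify` and the damped-oscillator example are ours, not part of the calculator):

```python
import numpy as np

def classify(A):
    """Qualitative behavior of dx/dt = Ax, read off the spectrum of A."""
    lam = np.linalg.eigvals(A)
    stable = bool(np.all(lam.real < 0))            # all eigenvalues in the left half-plane
    oscillatory = bool(np.any(np.abs(lam.imag) > 1e-12))
    return stable, oscillatory

# Damped oscillator x'' + x' + x = 0 as a first-order system:
# eigenvalues (-1 ± i*sqrt(3))/2, i.e. decaying oscillation
A = np.array([[0.0, 1.0],
              [-1.0, -1.0]])
print(classify(A))   # -> (True, True)
```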

Frequently Asked Questions

What are eigenvalues and eigenvectors?

An eigenvalue λ and eigenvector v satisfy Av = λv: the matrix A acts on v by simply scaling it by the factor λ, keeping it on the same line through the origin (the direction reverses if λ < 0).

What is the characteristic polynomial?

It is det(A − λI), a polynomial in λ of degree n (the matrix size). Its roots are the eigenvalues. For 2×2 matrices it is quadratic; for 3×3 it is cubic.

Can eigenvalues be complex?

Yes. Real matrices can have complex eigenvalues, which always come in conjugate pairs. This happens when the characteristic polynomial has no real roots, common in rotation matrices.
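A 90° rotation matrix illustrates the conjugate pair:

```python
import numpy as np

theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # 90-degree rotation

lam = np.linalg.eigvals(R)   # e^{+i*theta}, e^{-i*theta}: a conjugate pair, here ±i
assert np.allclose(sorted(lam.imag), [-1.0, 1.0])
assert np.allclose(lam.real, [0.0, 0.0])
assert np.isclose(lam[0], np.conj(lam[1]))
```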

What does diagonalizable mean?

A matrix is diagonalizable if it has n linearly independent eigenvectors. Then A = PDP⁻¹ where D is diagonal (eigenvalues) and P contains the eigenvectors as columns.

How do eigenvalues relate to stability?

For dynamical systems dx/dt = Ax, the system is stable if all eigenvalues have negative real parts and unstable if any eigenvalue has a positive real part; eigenvalues with nonzero imaginary parts produce oscillation.

What is the algebraic vs geometric multiplicity?

Algebraic multiplicity is how many times an eigenvalue appears as a root of the characteristic polynomial. Geometric multiplicity is the dimension of the eigenspace. Geometric ≤ algebraic always.
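The Jordan block [[2, 1], [0, 2]] is the standard example where the two multiplicities differ, which is exactly what makes a matrix non-diagonalizable:

```python
import numpy as np

# Eigenvalue 2 with algebraic multiplicity 2 but only a
# one-dimensional eigenspace (geometric multiplicity 1)
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])

lam = 2.0
algebraic = np.count_nonzero(np.isclose(np.roots(np.poly(A)), lam))
# Geometric multiplicity = dim null(A - lam*I) = n - rank(A - lam*I)
geometric = A.shape[0] - np.linalg.matrix_rank(A - lam * np.eye(2))

assert algebraic == 2
assert geometric == 1   # geometric <= algebraic, so A is not diagonalizable
```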

Related Pages