Compute eigenvalues and eigenvectors for 2×2 and 3×3 matrices with characteristic polynomial, diagonalization check, spectrum visualization, and detailed calculation steps.
Eigenvalues and eigenvectors are among the most important concepts in linear algebra, with applications spanning physics, engineering, data science, quantum mechanics, and machine learning. An eigenvalue λ and its corresponding eigenvector v satisfy Av = λv, meaning the matrix A acts on v by simply scaling it.
This calculator finds all eigenvalues and eigenvectors for 2×2 and 3×3 real matrices. It computes the characteristic polynomial det(A − λI) = 0, solves for the eigenvalues (roots of this polynomial), and then finds the corresponding eigenvectors by solving (A − λI)v = 0 for each eigenvalue.
For 2×2 matrices, the characteristic polynomial is a quadratic with a closed-form solution. For 3×3 matrices, it becomes a cubic, and the calculator uses Cardano's formula or numerical methods to find the roots. Complex eigenvalues are reported when they arise (e.g., rotation matrices).
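As a sketch of the 2×2 closed form (the `eig2x2` helper is our illustrative name, not the calculator's interface), the quadratic formula applied to λ² − tr(A)λ + det(A) = 0 handles real and complex cases uniformly if the square root is taken over the complex numbers:

```python
import cmath

def eig2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] from the quadratic
    characteristic polynomial lam^2 - tr(A)*lam + det(A) = 0."""
    tr = a + d                             # trace = sum of eigenvalues
    det = a * d - b * c                    # determinant = product of eigenvalues
    disc = cmath.sqrt(tr * tr - 4 * det)   # complex sqrt covers negative discriminants
    return (tr + disc) / 2, (tr - disc) / 2

print(eig2x2(4, 1, 2, 3))    # real pair
print(eig2x2(0, -1, 1, 0))   # 90° rotation: conjugate pair ±i
```

A negative discriminant (tr² < 4·det) is exactly the rotation-like case where the complex conjugate pair appears.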
The tool also checks whether the matrix is diagonalizable (has n linearly independent eigenvectors), computes the trace (sum of eigenvalues) and determinant (product of eigenvalues) as verification, and displays a spectrum visualization showing eigenvalue locations on a number line. This makes it easy to assess stability (all eigenvalues with negative real part), oscillatory behavior (complex eigenvalues), and, for symmetric matrices, conditioning (the ratio of the largest to smallest eigenvalue magnitude).
Finding eigenvalues requires solving the characteristic polynomial — a quadratic for 2×2 and a cubic for 3×3 — then back-substituting each root into (A − λI)v = 0 to find eigenvectors. The cubic case is especially tricky, and complex eigenvalues demand careful handling. This calculator computes the characteristic polynomial, all eigenvalues (real or complex), the corresponding eigenvectors, and checks diagonalizability, with a spectrum visualization. It is indispensable for students studying spectral theory, engineers analyzing system stability, and data scientists performing PCA.
det(A − λI) = 0 gives eigenvalues λ. For each λ, solve (A − λI)v = 0 for eigenvector v.
Example: A = [[4, 1], [2, 3]]. Result: λ₁ = 5, v₁ = [1, 1]; λ₂ = 2, v₂ = [−1, 2]
Characteristic polynomial: λ² − 7λ + 10 = (λ−5)(λ−2) = 0. For λ=5: (A−5I)v=0 gives v=[1,1]. For λ=2: (A−2I)v=0 gives v=[−1,2].
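This worked example can be checked numerically; A = [[4, 1], [2, 3]] is the unique matrix with those two eigenpairs (trace 7, determinant 10), and a short numpy verification looks like:

```python
import numpy as np

# A has tr(A) = 7 and det(A) = 10, reproducing lam^2 - 7*lam + 10.
A = np.array([[4.0, 1.0], [2.0, 3.0]])
vals, vecs = np.linalg.eig(A)

print(np.sort(vals))   # [2. 5.]

# Verify A v = lam v for each eigenpair (eigenvectors are the columns of vecs,
# returned normalized, so they are scalar multiples of [1,1] and [-1,2]).
for lam, v in zip(vals, vecs.T):
    assert np.allclose(A @ v, lam * v)
```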
Eigenvalues are the roots of the **characteristic polynomial** det(A − λI) = 0. For a 2×2 matrix [[a,b],[c,d]], this is λ² − (a+d)λ + (ad−bc) = λ² − tr(A)λ + det(A). For 3×3 matrices, it becomes a cubic, solved via Cardano's formula or the trigonometric method. The coefficients encode fundamental matrix properties: the coefficient of λ^(n−1) is −tr(A), and the constant term is (−1)^n det(A). The characteristic polynomial thus reduces the eigenvalue problem to root-finding.
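The coefficient identities can be illustrated with `np.poly`, which returns the monic characteristic polynomial coefficients of a matrix (the 3×3 matrix below is an arbitrary illustrative choice):

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

coeffs = np.poly(A)   # monic characteristic polynomial, highest degree first
print(coeffs)

# Coefficient of lam^(n-1) is -tr(A); constant term is (-1)^n det(A).
n = A.shape[0]
assert np.isclose(coeffs[1], -np.trace(A))
assert np.isclose(coeffs[-1], (-1) ** n * np.linalg.det(A))
```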
For each eigenvalue λ, the **eigenvector** is found by solving the homogeneous system (A − λI)v = 0. The set of all eigenvectors for a given λ forms the **eigenspace**, whose dimension is the geometric multiplicity. A matrix is **diagonalizable** if it has n linearly independent eigenvectors, in which case A = PDP⁻¹ where D is diagonal (eigenvalues) and P contains the eigenvectors as columns. Symmetric matrices are always diagonalizable with real eigenvalues and orthogonal eigenvectors.
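A minimal numpy sketch of the diagonalizability check and the A = PDP⁻¹ factorization (example matrix chosen for illustration):

```python
import numpy as np

A = np.array([[4.0, 1.0], [2.0, 3.0]])
vals, P = np.linalg.eig(A)      # columns of P are the eigenvectors
D = np.diag(vals)               # D is diagonal with the eigenvalues

# Diagonalizable: P has full rank (n linearly independent eigenvectors),
# and P D P^{-1} reconstructs the original matrix.
assert np.linalg.matrix_rank(P) == A.shape[0]
assert np.allclose(P @ D @ np.linalg.inv(P), A)
```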
In **dynamical systems** (dx/dt = Ax), eigenvalues determine stability: negative real parts mean decay (stable), positive mean growth (unstable), and imaginary parts mean oscillation. In **data science**, PCA uses the eigenvectors of the covariance matrix as principal components and the eigenvalues as the variance explained by each. In **quantum mechanics**, observable quantities correspond to eigenvalues of Hermitian operators, and measurement collapses states to eigenvectors. Google's PageRank algorithm finds the dominant eigenvector of the web link matrix. Eigenanalysis is among the most widely used tools in applied mathematics.
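The PCA connection can be sketched with synthetic data (the dataset and all choices here are illustrative assumptions, not part of the calculator):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 2-D data stretched strongly along the x-axis (std 3 vs 0.5).
X = rng.normal(size=(500, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])

C = np.cov(X, rowvar=False)        # 2x2 covariance matrix (symmetric)
vals, vecs = np.linalg.eigh(C)     # symmetric -> eigh, eigenvalues ascending

# Principal component = eigenvector with the largest eigenvalue;
# that eigenvalue is the variance along the component's direction.
pc1 = vecs[:, -1]                  # expected close to ±[1, 0]
explained = vals[-1] / vals.sum()  # fraction of variance explained by pc1
print(pc1, explained)
```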
An eigenvalue λ and eigenvector v satisfy Av = λv. The matrix A acts on v by simply scaling it by factor λ, without changing its direction (or reversing it if λ < 0).
It is det(A − λI), a polynomial in λ of degree n (the matrix size). Its roots are the eigenvalues. For 2×2 matrices it is quadratic; for 3×3 it is cubic.
Yes. Real matrices can have complex eigenvalues, which always come in conjugate pairs. This happens when the characteristic polynomial has no real roots, common in rotation matrices.
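A rotation matrix makes the conjugate-pair behavior concrete:

```python
import numpy as np

theta = np.pi / 2                       # 90° rotation
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

vals = np.linalg.eigvals(R)             # conjugate pair e^{±i*theta} = ±i here
print(vals)
```

No real vector keeps its direction under a 90° rotation, which is exactly why no real eigenvalue exists.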
A matrix is diagonalizable if it has n linearly independent eigenvectors. Then A = PDP⁻¹ where D is diagonal (eigenvalues) and P contains the eigenvectors as columns.
For dynamical systems dx/dt = Ax, the system is stable if all eigenvalues have negative real parts, unstable if any eigenvalue has a positive real part, and oscillatory if eigenvalues are complex.
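A small classifier along these lines (the `classify` helper and its tolerance are our illustrative choices, not the calculator's interface):

```python
import numpy as np

def classify(A, tol=1e-9):
    """Classify dx/dt = A x by the real/imaginary parts of the eigenvalues."""
    vals = np.linalg.eigvals(A)
    if np.all(vals.real < -tol):
        verdict = "stable"            # every mode decays
    elif np.any(vals.real > tol):
        verdict = "unstable"          # at least one growing mode
    else:
        verdict = "marginal"          # eigenvalues on the imaginary axis
    if np.any(np.abs(vals.imag) > 1e-12):
        verdict += ", oscillatory"    # complex pair -> oscillation
    return verdict

print(classify(np.array([[-1.0, 0.0], [0.0, -2.0]])))   # stable
print(classify(np.array([[0.0, -1.0], [1.0, 0.0]])))    # marginal, oscillatory
print(classify(np.array([[-0.5, -2.0], [2.0, -0.5]])))  # stable, oscillatory
```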
Algebraic multiplicity is how many times an eigenvalue appears as a root of the characteristic polynomial. Geometric multiplicity is the dimension of the eigenspace. Geometric ≤ algebraic always.
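The classic example of a gap between the two multiplicities is a Jordan block; a numpy sketch:

```python
import numpy as np

# Jordan block: eigenvalue 2 appears twice in the characteristic polynomial
# (algebraic multiplicity 2) but the eigenspace is one-dimensional.
J = np.array([[2.0, 1.0], [0.0, 2.0]])

vals = np.linalg.eigvals(J)
alg_mult = np.sum(np.isclose(vals, 2.0))

# Geometric multiplicity = dim null(A - lam*I) = n - rank(A - lam*I).
geo_mult = J.shape[0] - np.linalg.matrix_rank(J - 2.0 * np.eye(2))

print(alg_mult, geo_mult)   # 2 1 -> geometric < algebraic, not diagonalizable
```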