QR Decomposition Calculator

Compute QR decomposition via Gram-Schmidt. View orthogonal Q and upper triangular R with step-by-step process, verification, and application reference.

About the QR Decomposition Calculator

The QR decomposition factors a square matrix A into the product A = QR, where Q is an orthogonal matrix (its columns are orthonormal) and R is upper triangular. This factorization is one of the most important tools in numerical linear algebra, underpinning efficient algorithms for solving linear systems, computing eigenvalues, and performing least-squares regression.

The classical approach uses the Gram-Schmidt process: starting with the columns of A, each column is orthogonalized against all previous columns to build Q, and the projection coefficients form R. The result is a set of orthonormal vectors that span the same column space as A, paired with an upper triangular matrix that encodes the transformation.

This calculator performs Gram-Schmidt QR decomposition for 2×2 and 3×3 matrices. It shows the complete step-by-step process including each projection and normalization, verifies A = QR and QᵀQ = I, displays properties of both Q and R, and provides an application reference table showing how QR is used in practice. Load one of the presets to see a worked example instantly.

Why Use This QR Decomposition Calculator?

The Gram-Schmidt process involves repeated projections and normalizations that are prone to arithmetic mistakes when done by hand. Even a small error in an early step propagates through all subsequent orthogonalizations. This calculator performs the entire process with full numerical precision, shows every intermediate step, and automatically verifies the result satisfies A = QR and QᵀQ = I. It is ideal for homework, exam preparation, and quick verification of hand computations.

How to Use This Calculator

  1. Select the matrix size (2×2 or 3×3)
  2. Enter entries for matrix A or click a preset to load an example
  3. View Q (orthogonal) and R (upper triangular) in the result matrices
  4. Check the verification outputs to confirm A = QR and QᵀQ = I
  5. Read the Gram-Schmidt steps to understand each projection
  6. Use the R diagonal bars to visualize the magnitude of diagonal entries
  7. Refer to the applications table for real-world uses of QR

Formula

QR Decomposition: A = QR where Q is orthogonal (QᵀQ = I) and R is upper triangular. Gram-Schmidt: uⱼ = aⱼ − Σᵢ₌₁ʲ⁻¹ ⟨aⱼ, eᵢ⟩eᵢ, eⱼ = uⱼ/‖uⱼ‖, Rᵢⱼ = ⟨eᵢ, aⱼ⟩ for i ≤ j.

Example Calculation

Result: for A = [[12,−51,4],[6,167,−68],[−4,24,−41]], Q ≈ [[0.857,−0.394,−0.331],[0.429,0.903,0.034],[−0.286,0.171,−0.943]], R = [[14,21,−14],[0,175,−70],[0,0,35]]

Gram-Schmidt orthogonalizes the three columns of A. The resulting Q has orthonormal columns (each has unit length, and columns are perpendicular), while R captures the projection coefficients.
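The result above matches, up to possible column-sign flips, the classic 3×3 textbook example. Assuming that input matrix, the factorization can be checked in a few lines of NumPy (sign conventions may differ between routines, so the check compares magnitudes, not signs):

```python
import numpy as np

# Assumed input: the classic textbook example whose QR matches
# the Q and R shown above (up to column-sign flips).
A = np.array([[12.0, -51.0,   4.0],
              [ 6.0, 167.0, -68.0],
              [-4.0,  24.0, -41.0]])

Q, R = np.linalg.qr(A)  # LAPACK-backed Householder QR

print(np.allclose(Q @ R, A))            # True: A = QR
print(np.allclose(Q.T @ Q, np.eye(3)))  # True: orthonormal columns
print(np.abs(np.diag(R)))               # diagonal magnitudes 14, 175, 35
```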

Tips & Best Practices

The Gram-Schmidt Process in Detail

Starting with columns a₁, a₂, ..., aₙ of matrix A, the Gram-Schmidt process constructs orthonormal vectors e₁, e₂, ..., eₙ. First, e₁ = a₁/‖a₁‖. Then for each subsequent column aⱼ, compute uⱼ = aⱼ − Σᵢ₌₁ʲ⁻¹ ⟨aⱼ, eᵢ⟩eᵢ (subtract projections) and normalize: eⱼ = uⱼ/‖uⱼ‖. The matrix Q = [e₁ | e₂ | ... | eₙ] has orthonormal columns, and R is upper triangular with Rᵢⱼ = ⟨eᵢ, aⱼ⟩ for i ≤ j. Classical Gram-Schmidt can lose orthogonality due to floating-point errors; the modified version subtracts each projection from the partially orthogonalized vector immediately, computing each coefficient against the updated vector, which keeps rounding error from accumulating.
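The loop above translates directly into code. Here is a minimal NumPy sketch of the modified variant (the function name mgs_qr is ours, not part of the calculator; it assumes the columns of A are linearly independent):

```python
import numpy as np

def mgs_qr(A):
    """QR via modified Gram-Schmidt.

    Each projection coefficient is computed against the partially
    orthogonalized vector u, not the original column, which is what
    distinguishes the modified variant from classical Gram-Schmidt.
    Assumes linearly independent columns (so every R[j, j] > 0).
    """
    A = np.array(A, dtype=float)
    n = A.shape[1]
    Q = np.zeros_like(A)
    R = np.zeros((n, n))
    for j in range(n):
        u = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ u      # coefficient from the updated u
            u -= R[i, j] * Q[:, i]     # subtract the projection immediately
        R[j, j] = np.linalg.norm(u)
        Q[:, j] = u / R[j, j]
    return Q, R

Q, R = mgs_qr([[12, -51, 4], [6, 167, -68], [-4, 24, -41]])
print(np.diag(R))  # positive diagonal: 14, 175, 35
```

Because Rⱼⱼ = ‖uⱼ‖, Gram-Schmidt always yields a positive diagonal, unlike Householder-based routines whose sign conventions vary.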

QR in Numerical Linear Algebra

QR decomposition is a workhorse: it is used for solving linear systems (more stable than LU for ill-conditioned problems), least-squares regression (the standard algorithm in R's lm() and Python's numpy.linalg.lstsq), eigenvalue computation (QR algorithm), and condition number estimation. The Householder variant requires about 2mn² − 2n³/3 flops for an m×n matrix (roughly 4n³/3 when square) and is backward stable, making it the default implementation in LAPACK (dgeqrf). Givens rotations are an alternative that zeros one element at a time, useful for sparse or banded matrices.
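To illustrate the Householder approach, here is a minimal NumPy sketch (an illustrative implementation, not what LAPACK actually does; dgeqrf stores the reflectors in a compact form rather than accumulating Q explicitly):

```python
import numpy as np

def householder_qr(A):
    """Full QR via Householder reflections.

    Each reflection H = I - 2vv^T zeros the sub-diagonal entries of
    one column; R accumulates H_k...H_1 A and Q accumulates H_1...H_k.
    """
    A = np.array(A, dtype=float)
    m, n = A.shape
    Q = np.eye(m)
    R = A.copy()
    for k in range(min(m - 1, n)):
        x = R[k:, k].copy()
        v = x
        # Reflect x onto -sign(x[0])*||x||*e1; this sign choice
        # avoids cancellation when forming v.
        v[0] += np.copysign(np.linalg.norm(x), x[0])
        norm_v = np.linalg.norm(v)
        if norm_v == 0.0:
            continue                   # column already zeroed below the diagonal
        v /= norm_v
        R[k:, :] -= 2.0 * np.outer(v, v @ R[k:, :])   # apply H on the left
        Q[:, k:] -= 2.0 * np.outer(Q[:, k:] @ v, v)   # accumulate Q on the right
    return Q, R
```

Unlike Gram-Schmidt, each reflection is exactly orthogonal in floating point, which is why Householder QR keeps QᵀQ ≈ I even for ill-conditioned inputs.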

From QR to Least Squares

For an overdetermined system Ax ≈ b (more equations than unknowns), the least-squares solution minimizes ‖Ax − b‖₂. Using QR: write A = QR, then ‖Ax − b‖₂ = ‖Rx − Qᵀb‖₂ because Q preserves norms. Minimizing this reduces to solving the triangular system Rx = Qᵀb via back-substitution. This is cheaper and more stable than forming the normal equations AᵀAx = Aᵀb, which squares the condition number.
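The reduction to a triangular solve can be coded directly. A short sketch (the helper name lstsq_qr and the sample data are ours, not part of the calculator):

```python
import numpy as np

def lstsq_qr(A, b):
    """Least-squares solution of Ax ~ b via thin QR and back-substitution."""
    Q, R = np.linalg.qr(A)              # thin QR: Q is m*n, R is n*n
    y = Q.T @ b                         # project b onto the column space basis
    n = R.shape[0]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):      # back-substitution on triangular R
        x[i] = (y[i] - R[i, i + 1:] @ x[i + 1:]) / R[i, i]
    return x

# Overdetermined 4x2 example: fit y = c0 + c1*t to points on the line y = 1 + 2t
t = np.array([0.0, 1.0, 2.0, 3.0])
A = np.column_stack([np.ones_like(t), t])
b = np.array([1.0, 3.0, 5.0, 7.0])
print(lstsq_qr(A, b))                   # ~ [1, 2]
```

Because the sample points lie exactly on a line, the residual is zero; for noisy data the same call returns the minimizer of ‖Ax − b‖₂.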

Frequently Asked Questions

What is QR decomposition?

QR decomposition factors a matrix A as A = QR, where Q has orthonormal columns (QᵀQ = I) and R is upper triangular (all entries below the diagonal are zero). It exists for every square matrix and extends to rectangular matrices.

What is the Gram-Schmidt process?

Gram-Schmidt takes a set of linearly independent vectors and produces an orthonormal set spanning the same space. For each new vector, it subtracts the projections onto all previous orthonormal vectors and normalizes the result.

Why is QR decomposition useful for solving systems?

Given Ax = b, substitute A = QR to get QRx = b, then Rx = Qᵀb (since QᵀQ = I). The system Rx = Qᵀb is upper triangular and can be solved by back-substitution in O(n²) time.

How is QR related to eigenvalue computation?

The QR algorithm iteratively decomposes Aₖ = QₖRₖ and forms Aₖ₊₁ = RₖQₖ. Under mild conditions, this converges to an upper triangular (Schur) form whose diagonal entries are the eigenvalues.
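The iteration fits in a few lines (a toy unshifted version for a small symmetric matrix; practical implementations first reduce to Hessenberg form and add shifts for fast convergence):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])             # eigenvalues (5 +/- sqrt(5))/2

Ak = A.copy()
for _ in range(50):
    Q, R = np.linalg.qr(Ak)
    Ak = R @ Q                         # similarity transform: same eigenvalues

# Off-diagonal entries shrink toward zero; the diagonal approaches
# the eigenvalues, approximately 3.618034 and 1.381966.
print(np.round(np.diag(Ak), 6))
```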

What is the difference between Gram-Schmidt and Householder QR?

Gram-Schmidt builds Q column by column via projections. Householder uses reflections to zero out sub-diagonal entries. Householder is more numerically stable and is the default in most numerical libraries.

Does QR decomposition work for rectangular matrices?

Yes. For an m×n matrix with m ≥ n, the "thin" QR gives Q as m×n with orthonormal columns and R as n×n upper triangular. This is the standard approach for least-squares problems.
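The thin/full distinction is visible directly in NumPy, where the reduced form is the default:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))          # overdetermined: m = 5 > n = 3

Q, R = np.linalg.qr(A)                   # mode='reduced' is the default (thin QR)
print(Q.shape, R.shape)                  # (5, 3) (3, 3)

Qf, Rf = np.linalg.qr(A, mode='complete')
print(Qf.shape, Rf.shape)                # (5, 5) (5, 3)
```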

Related Pages