Apply the Gram-Schmidt process to 2–4 vectors in R², R³, or R⁴. View step-by-step projections, orthonormal results, dot-product verification matrix, and norm comparison bars.
The Gram-Schmidt process is a fundamental algorithm in linear algebra that converts a set of linearly independent vectors into an orthogonal (or orthonormal) set spanning the same subspace. Given input vectors v₁, v₂, …, vₖ, it produces u₁, u₂, …, uₖ such that uᵢ · uⱼ = 0 for every pair i ≠ j. Optionally normalizing each uᵢ to unit length yields an orthonormal basis.
This calculator implements the classical Gram-Schmidt algorithm for vectors in R², R³, and R⁴ with up to four input vectors. Enter your vectors component-by-component or select from built-in presets that demonstrate common scenarios — standard bases, physics-motivated triples, and higher-dimensional examples.
Each step is displayed in a projections table showing which vector is being projected onto which orthogonal vector, the resulting projection vector, and its magnitude. After the process completes, the output shows the full orthogonal set with norms and, if requested, the orthonormal set with unit-length verification.
A color-coded dot-product verification matrix confirms orthogonality: off-diagonal entries should be zero (green), while diagonal entries (blue) show each vector's self-dot-product. Norm comparison bars let you quickly see relative magnitudes across the orthogonal set, revealing how the process redistributes length among the basis vectors.
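The verification matrix and norm bars are straightforward to reproduce offline. Here is a minimal NumPy sketch (not the calculator's internals) using the orthogonal set from the worked example on this page:

```python
import numpy as np

# Orthogonal set from the R^3 worked example on this page
U = np.array([
    [1.0, 1.0, 1.0],    # u1
    [1/3, 1/3, -2/3],   # u2
    [0.5, -0.5, 0.0],   # u3
])

# Dot-product verification matrix: G[i, j] = u_i . u_j
G = U @ U.T

# Off-diagonal entries vanish for an orthogonal set;
# diagonal entries are each vector's self-dot-product (squared norm).
print(np.allclose(G - np.diag(np.diag(G)), 0.0))  # True

# Data behind the norm-comparison bars
norms = np.linalg.norm(U, axis=1)
```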
The Gram-Schmidt process underpins QR decomposition, least-squares approximation, and many numerical methods. Understanding it visually through step-by-step projections builds intuition for how orthogonal bases simplify computations across mathematics, physics, and engineering.
The Gram-Schmidt process involves iterative projections and subtractions that accumulate rounding errors, especially with near-parallel input vectors. Keeping track of which projection has been subtracted from which vector quickly becomes overwhelming for more than two vectors. This calculator performs the full orthogonalization, displays every projection step, verifies orthogonality via a dot-product matrix, and normalizes the output to produce an orthonormal basis. It is essential for students learning QR decomposition, engineers constructing coordinate frames, and anyone building orthonormal bases for least-squares problems.
uₖ = vₖ − Σⱼ₌₁ᵏ⁻¹ proj_{uⱼ}(vₖ), where proj_u(v) = (v · u / u · u) u; êₖ = uₖ / ‖uₖ‖
Result: u₁ = (1, 1, 1), u₂ = (⅓, ⅓, −⅔), u₃ = (½, −½, 0)
u₁ = v₁. u₂ = v₂ − proj_{u₁}(v₂) = (1,1,0) − (⅔)(1,1,1) = (⅓,⅓,−⅔). u₃ = v₃ − proj_{u₁}(v₃) − proj_{u₂}(v₃) = (½,−½,0). Dot products u₁·u₂ = 0, u₁·u₃ = 0, u₂·u₃ = 0 — confirmed orthogonal.
Given input vectors v₁, v₂, …, vₖ, the classical Gram-Schmidt process computes orthogonal vectors u₁, u₂, …, uₖ iteratively. Set u₁ = v₁. For each subsequent vector vₖ, subtract its projections onto all previously computed orthogonal vectors: uₖ = vₖ − Σⱼ₌₁ᵏ⁻¹ proj_{uⱼ}(vₖ), where proj_u(v) = (v·u / u·u)u. Each subtraction removes the component of vₖ in the direction of uⱼ, leaving only the component orthogonal to all previous vectors. Normalizing each uₖ to unit length yields an orthonormal basis.
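The loop just described translates almost line-for-line into code. A minimal sketch in Python/NumPy (the function name `gram_schmidt` is ours; the input triple is consistent with the worked example above, with v₃ = (1, 0, 0) as our choice):

```python
import numpy as np

def gram_schmidt(vectors, normalize=False):
    """Classical Gram-Schmidt on a list of linearly independent vectors."""
    ortho = []
    for v in vectors:
        v = np.asarray(v, dtype=float)
        u = v.copy()
        for q in ortho:
            # Classical variant: coefficients come from the ORIGINAL v
            u = u - (v @ q) / (q @ q) * q
        ortho.append(u)
    if normalize:
        ortho = [u / np.linalg.norm(u) for u in ortho]
    return ortho

u1, u2, u3 = gram_schmidt([[1, 1, 1], [1, 1, 0], [1, 0, 0]])
# u1 = (1, 1, 1), u2 = (1/3, 1/3, -2/3), u3 = (1/2, -1/2, 0)
```

Passing `normalize=True` divides each output by its norm, yielding the orthonormal basis êₖ = uₖ / ‖uₖ‖.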
The classical algorithm is numerically unstable for near-parallel vectors: accumulated floating-point errors cause the output vectors to lose orthogonality. The **modified Gram-Schmidt** variant improves stability by computing each projection coefficient from the partially orthogonalized vector rather than from the original vₖ. In exact arithmetic both versions produce identical results, but in floating point modified Gram-Schmidt preserves orthogonality far better in practice. For even better stability, Householder reflections (used in QR factorization) are preferred.
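The two variants differ by a single line: whether the projection coefficient is taken from the original vector or from the running, partially orthogonalized one. A small NumPy experiment with nearly parallel inputs (the 1e-8 perturbation is an arbitrary choice for illustration) makes the stability gap visible:

```python
import numpy as np

def orthonormalize(A, modified):
    """Orthonormalize the rows of A by classical or modified Gram-Schmidt."""
    A = A.astype(float)
    Q = A.copy()
    for i in range(Q.shape[0]):
        for j in range(i):
            # The one-line difference: modified GS takes the coefficient
            # from the running vector Q[i], classical from the original A[i].
            src = Q[i] if modified else A[i]
            Q[i] = Q[i] - (src @ Q[j]) * Q[j]  # Q[j] is already unit length
        Q[i] = Q[i] / np.linalg.norm(Q[i])
    return Q

def ortho_error(Q):
    """Largest deviation of Q Q^T from the identity (0 = orthonormal)."""
    return np.abs(Q @ Q.T - np.eye(Q.shape[0])).max()

eps = 1e-8  # perturbation making the inputs nearly parallel
A = np.array([[1.0, 1.0, 1.0],
              [1.0, 1.0, 1.0 + eps],
              [1.0, 1.0 + eps, 1.0]])

print(ortho_error(orthonormalize(A, modified=False)))  # classical: large loss
print(ortho_error(orthonormalize(A, modified=True)))   # modified: far smaller
```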
Gram-Schmidt is the constructive proof behind **QR decomposition**: A = QR, where Q’s columns are the orthonormal vectors and R is upper triangular with entries rᵢⱼ = qᵢ · vⱼ. QR decomposition is used in least-squares regression (solving overdetermined systems), eigenvalue algorithms (QR iteration), and solving linear systems that are ill-conditioned. Understanding the Gram-Schmidt process provides concrete intuition for why orthonormal bases simplify so many computations — projections become dot products, and least-squares solutions reduce to back substitution.
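To make the connection concrete, the following sketch builds Q by orthonormalizing the columns of A with Gram-Schmidt and recovers R as QᵀA. The matrix here reuses the worked example's triple as columns; any linearly independent columns work the same way:

```python
import numpy as np

# Columns of A are the input vectors (the triple from the worked example).
A = np.column_stack([[1.0, 1.0, 1.0],
                     [1.0, 1.0, 0.0],
                     [1.0, 0.0, 0.0]])

Q = np.zeros_like(A)
for k in range(A.shape[1]):
    u = A[:, k].copy()
    for j in range(k):
        u -= (A[:, k] @ Q[:, j]) * Q[:, j]  # Q's columns are unit length
    Q[:, k] = u / np.linalg.norm(u)

# R's entries are the projection coefficients r_ij = q_i . v_j
R = Q.T @ A

print(np.allclose(Q @ R, A))           # True: A = QR
print(np.allclose(np.tril(R, -1), 0))  # True: R is upper triangular
```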
It is an algorithm that takes a set of linearly independent vectors and produces an orthogonal (or orthonormal) set spanning the same subspace by iteratively subtracting projections onto previously computed orthogonal vectors.
One or more output vectors will be the zero vector, indicating that the corresponding input vector was a linear combination of previous vectors and contributes no new direction to the span.
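A quick numerical check of this behavior (sketch with a hypothetical `gram_schmidt` helper; a robust implementation should also guard against dividing by a near-zero ‖uⱼ‖²):

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt; dependent inputs yield (near-)zero vectors."""
    ortho = []
    for v in vectors:
        v = np.asarray(v, dtype=float)
        u = v.copy()
        for q in ortho:
            u = u - (v @ q) / (q @ q) * q
        ortho.append(u)
    return ortho

# v3 = v1 + v2, so the third output collapses to the zero vector
u1, u2, u3 = gram_schmidt([[1, 0, 1], [0, 1, 1], [1, 1, 2]])
print(np.allclose(u3, 0))  # True
```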
Orthogonal means mutually perpendicular (dot products are zero); orthonormal additionally requires each vector to have unit length (magnitude 1).
In QR decomposition, Q contains the orthonormal vectors from Gram-Schmidt as columns, and R contains the projection coefficients that express the original vectors in the new basis.
It fails only if the input vectors are linearly dependent. In finite precision arithmetic, near-dependent vectors can cause numerical instability, which the modified Gram-Schmidt algorithm addresses.
Each vector is orthogonalized against previously computed vectors, so changing the order changes which projections are subtracted at each step, producing a different orthogonal basis for the same span.
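For example (sketch; the input pair is an arbitrary choice for illustration):

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt orthogonalization."""
    ortho = []
    for v in vectors:
        v = np.asarray(v, dtype=float)
        u = v.copy()
        for q in ortho:
            u = u - (v @ q) / (q @ q) * q
        ortho.append(u)
    return ortho

# Same two vectors, opposite orders
a = gram_schmidt([[1, 1, 1], [1, 0, 0]])
b = gram_schmidt([[1, 0, 0], [1, 1, 1]])

print(a[1])  # (2/3, -1/3, -1/3)
print(b[1])  # (0, 1, 1) -- a different orthogonal basis for the same plane
```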