Factor a square matrix into lower and upper triangular matrices (A = LU) with the step-by-step Doolittle method, solve Ax = b via forward and back substitution, and verify the result by reconstructing A from L·U.
LU decomposition factors a square matrix A into the product of a lower triangular matrix L and an upper triangular matrix U, so that A = LU. This factorization is the computational backbone of most numerical linear algebra: once you have L and U, solving Ax = b becomes two simple triangular solves — forward substitution for Ly = b, then back substitution for Ux = y — each requiring only O(n²) operations.
This calculator implements the Doolittle algorithm, where L has 1s on the diagonal and U retains the original diagonal scaling. The step-by-step display shows how each element of L and U is computed, with the multipliers (entries of L) clearly identified.
For systems with the same coefficient matrix but different right-hand sides (b vectors), LU decomposition is far more efficient than re-running Gauss-Jordan elimination each time — you factor once and solve repeatedly. This is why LU is the workhorse method in engineering simulations and scientific computing.
The calculator handles matrices from 2×2 to 5×5, with optional partial pivoting (row swaps) when a zero pivot is encountered. It also computes the determinant from the diagonal of U (since det(A) = det(L)·det(U) = product of diagonal entries of U when L has unit diagonal), and verifies that L·U reproduces the original matrix A.
LU decomposition requires careful bookkeeping of multipliers and pivots across n² entries — a single incorrect multiplier corrupts all subsequent rows. The savings come from factoring once and reusing L and U for many right-hand sides, but the factoring step itself is where mistakes occur. This calculator shows every multiplier, pivot swap, and row operation, displays the final L and U, computes the determinant from U’s diagonal, and solves Ax = b via forward and back substitution. It is the preferred tool for students learning triangular factorization and engineers verifying decompositions.
A = LU where L is lower triangular with 1s on diagonal, U is upper triangular. Solve: Ly = b (forward), Ux = y (backward).
Result for A = [[2,1,1],[4,3,3],[8,7,9]]: L = [[1,0,0],[2,1,0],[4,3,1]], U = [[2,1,1],[0,1,1],[0,0,2]]
Row 2: l₂₁=4/2=2, subtract 2×R1 from R2. Row 3: l₃₁=8/2=4, l₃₂=(7-4·1)/1=3, giving the final U.
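The worked example can be checked numerically: multiplying the computed L and U should reproduce the original matrix. A minimal check, assuming NumPy is available:

```python
import numpy as np

# L and U from the worked example above.
L = np.array([[1, 0, 0],
              [2, 1, 0],
              [4, 3, 1]], dtype=float)
U = np.array([[2, 1, 1],
              [0, 1, 1],
              [0, 0, 2]], dtype=float)

# Reconstruct A = L @ U; this should equal the original matrix
# [[2, 1, 1], [4, 3, 3], [8, 7, 9]].
A = L @ U
print(A)
```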
The Doolittle method constructs L with 1s on the diagonal and stores the elimination multipliers below. For each column k, the algorithm computes the entries of U (uₖⱼ = aₖⱼ − Σₘ<ₖ lₖₘuₘⱼ for j ≥ k) and the off-diagonal entries of L (lᵢₖ = (aᵢₖ − Σₘ<ₖ lᵢₘuₘₖ) / uₖₖ for i > k). Each multiplier records how much of row k was subtracted from row i during elimination — this is why LU decomposition is equivalent to Gaussian elimination with the operations recorded in L.
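The two update formulas translate directly into code. The following is a minimal sketch of Doolittle's algorithm, assuming NumPy and nonzero pivots; `doolittle` is an illustrative name, not the calculator's internal function:

```python
import numpy as np

def doolittle(A):
    """Doolittle LU factorization without pivoting (assumes nonzero pivots)."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    L = np.eye(n)          # unit diagonal
    U = np.zeros((n, n))
    for k in range(n):
        # Row k of U: u_kj = a_kj - sum_{m<k} l_km * u_mj   (j >= k)
        for j in range(k, n):
            U[k, j] = A[k, j] - L[k, :k] @ U[:k, j]
        # Column k of L: l_ik = (a_ik - sum_{m<k} l_im * u_mk) / u_kk   (i > k)
        for i in range(k + 1, n):
            L[i, k] = (A[i, k] - L[i, :k] @ U[:k, k]) / U[k, k]
    return L, U
```

Running it on the example matrix [[2,1,1],[4,3,3],[8,7,9]] reproduces the L and U shown above.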
When a zero (or very small) pivot is encountered, **partial pivoting** swaps rows to place the largest absolute value in the pivot position, improving numerical stability. The factorization becomes PA = LU where P is a permutation matrix recording the row swaps. Modern numerical libraries (LAPACK, NumPy) always use partial pivoting. The determinant is then det(A) = (−1)^s · ∏ uᵢᵢ, where s is the number of row swaps.
The factorization phase costs ≈ 2n³/3 floating-point operations, comparable to Gaussian elimination. The key advantage is **reusability**: each subsequent solve for a new right-hand side b costs only 2n² operations (forward + back substitution). This makes LU decomposition the standard method in circuit simulation, structural analysis, and any application where the same system matrix is solved with many different loads or inputs. For symmetric positive definite matrices, Cholesky decomposition (A = LLᵀ) is about twice as fast.
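The reuse pattern — factor once, then run the two cheap triangular solves per right-hand side — can be sketched as follows. This assumes NumPy and a unit-diagonal L as produced by Doolittle; `forward_sub` and `back_sub` are illustrative helper names:

```python
import numpy as np

def forward_sub(L, b):
    """Solve Ly = b for lower-triangular L with unit diagonal (O(n^2))."""
    n = len(b)
    y = np.zeros(n)
    for i in range(n):
        y[i] = b[i] - L[i, :i] @ y[:i]
    return y

def back_sub(U, y):
    """Solve Ux = y for upper-triangular U (O(n^2))."""
    n = len(y)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (y[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

# Factor once (L, U from the worked example), then solve for two different b:
L = np.array([[1., 0, 0], [2, 1, 0], [4, 3, 1]])
U = np.array([[2., 1, 1], [0, 1, 1], [0, 0, 2]])
x1 = back_sub(U, forward_sub(L, np.array([4., 10, 24])))  # -> [1, 1, 1]
x2 = back_sub(U, forward_sub(L, np.array([2., 4, 8])))    # -> [1, 0, 0]
```

No part of the O(n³) factorization is repeated for the second right-hand side; only the two O(n²) substitutions run again.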
LU decomposition factors a square matrix A into a lower triangular matrix L (with 1s on the diagonal) and an upper triangular matrix U, such that A = LU. It is equivalent to Gaussian elimination with the multipliers stored in L.
The Doolittle algorithm is a specific LU factorization where L has 1s on the diagonal. An alternative is the Crout method where U has 1s on the diagonal. Both produce valid LU factorizations.
First solve Ly = b by forward substitution (top to bottom), then solve Ux = y by back substitution (bottom to top). Each step is O(n²), making the total solve O(n²) after the O(n³) factorization.
If a zero pivot is encountered, row swapping (partial pivoting) is needed. The factorization becomes PA = LU where P is a permutation matrix. This calculator handles this automatically.
The factorization A = LU costs O(n³) once. Each subsequent solve for a new b costs only O(n²). Gauss-Jordan would cost O(n³) for every new b.
Since det(A) = det(L)·det(U), and det(L) = 1 (unit diagonal), det(A) equals the product of the diagonal entries of U (times (−1)^s if s row swaps were made). This gives an O(n³) determinant computation — far cheaper than cofactor expansion, which is O(n!).
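For the worked example, the determinant falls straight out of U's diagonal. A minimal check, assuming NumPy:

```python
import numpy as np

# U from the worked example; no row swaps were needed, so s = 0.
U = np.array([[2., 1, 1], [0, 1, 1], [0, 0, 2]])
swaps = 0
det = (-1) ** swaps * np.prod(np.diag(U))
print(det)  # 2 * 1 * 2 = 4.0
```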