Calculate the Hadamard (element-wise) product of two matrices. Compare with standard multiplication, view properties, and explore visual breakdowns.
The Hadamard product — also called the Schur product or element-wise product — multiplies two matrices of the same dimension entry by entry: (A ∘ B)ᵢⱼ = aᵢⱼ × bᵢⱼ. Unlike standard matrix multiplication, the Hadamard product is commutative, meaning A ∘ B always equals B ∘ A. It also preserves the dimensions of the input matrices, making it conceptually simpler than the standard row-by-column matrix product.
Despite its simplicity, the Hadamard product appears throughout mathematics and applied science. In statistics, it appears in element-wise variance computations and in multivariate analysis. In signal processing, applying a window function to a signal is a Hadamard product. Neural networks use element-wise multiplication extensively — gating mechanisms in LSTMs and attention layers rely on it. Image processing applies masks and filters via the Hadamard product.
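As a small illustration of the signal-processing case above, windowing a frame of samples is exactly an element-wise product. This sketch uses NumPy; the signal values are made up for illustration:

```python
import numpy as np

# A short "signal" frame and a Hann window of the same length.
signal = np.array([1.0, 2.0, 3.0, 2.0, 1.0])
window = np.hanning(5)  # tapers from 0 up to 1 and back down to 0

# Windowing is an element-wise (Hadamard) product of two same-length arrays.
windowed = signal * window

# The endpoints of a Hann window are zero, so the frame edges are zeroed out.
```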
This calculator supports 2×2, 3×3, and 4×4 matrices. Enter your values or load a preset, and instantly see the result matrix with color-coded cells showing magnitude, a side-by-side properties comparison with standard multiplication, and a complete element-by-element breakdown table. Visual magnitude bars help you quickly identify the dominant entries in the product.
Standard matrix multiplication involves summing products across rows and columns, which changes dimensions and is non-commutative. The Hadamard product keeps things simple — same-position entries are multiplied directly. This calculator shows both side by side, so you can instantly see the difference. It is invaluable for students learning the distinction, data scientists applying element-wise operations in NumPy or PyTorch, and engineers working with masking or gating operations.
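The side-by-side comparison described above can be reproduced in a few lines of NumPy, where `*` is the element-wise (Hadamard) product and `@` is standard matrix multiplication:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

hadamard = A * B   # element-wise: result has the same shape as the inputs
standard = A @ B   # row-by-column sums: inner dimensions must match

# The Hadamard product is commutative; the standard product generally is not.
```

The same `*` operator works on PyTorch and TensorFlow tensors, so the distinction carries over directly to deep-learning code.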
Hadamard Product: (A ∘ B)ᵢⱼ = aᵢⱼ × bᵢⱼ for all i, j. Both matrices must have identical dimensions m × n.
For A = [[1, 2], [3, 4]] and B = [[5, 6], [7, 8]], the result is [[5, 12], [21, 32]].
Each element is multiplied position-wise: 1×5=5, 2×6=12, 3×7=21, 4×8=32. The sum of the result is 70.
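The element-by-element breakdown above can be reproduced with a plain nested loop, no library required:

```python
A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]

# Multiply same-position entries: C[i][j] = A[i][j] * B[i][j].
C = [[A[i][j] * B[i][j] for j in range(2)] for i in range(2)]

# Total of all result entries: 5 + 12 + 21 + 32 = 70.
total = sum(sum(row) for row in C)
```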
The Hadamard product, denoted A ∘ B, is the simplest matrix "multiplication" — each entry is the product of the same-position entries in the two input matrices. Formally, for two m×n matrices A and B, the product C = A ∘ B is also m×n with Cᵢⱼ = Aᵢⱼ × Bᵢⱼ. Named after French mathematician Jacques Hadamard, this operation has elegant algebraic properties: it is commutative, associative, and distributes over addition. The Schur product theorem guarantees that if A and B are both positive semi-definite, then A ∘ B is also positive semi-definite.
In modern deep learning, the Hadamard product is everywhere. LSTM cells use element-wise gating: the forget gate is applied to the cell state via a Hadamard product, selectively erasing information. Transformer attention mechanisms compute element-wise products when applying masks. Dropout training can be viewed as a Hadamard product with a random binary mask. Batch normalization involves element-wise scaling and shifting. Frameworks like PyTorch and TensorFlow use the * operator for Hadamard products on tensors, making element-wise multiplication one of the most common operations in deep-learning code.
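A minimal sketch of the gating and masking ideas described above, using NumPy in place of a deep-learning framework; the state and gate values are made up for illustration:

```python
import numpy as np

# Hypothetical cell state and forget-gate activations (illustrative values).
cell_state = np.array([0.8, -1.2, 2.0, 0.5])
forget_gate = np.array([1.0, 0.0, 0.5, 1.0])  # 1 = keep, 0 = erase

# LSTM-style gating: the gate multiplies the state element-wise,
# shrinking or erasing individual entries.
gated = forget_gate * cell_state

# A dropout mask works the same way: a random 0/1 mask applied by Hadamard product.
rng = np.random.default_rng(0)
mask = (rng.random(4) > 0.5).astype(float)
dropped = mask * cell_state
```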
Beyond the Hadamard and standard products, there are several other matrix products. The Kronecker product A ⊗ B produces a block matrix of size (mp × nq) for an m×n and p×q matrix. The Khatri-Rao product is a column-wise Kronecker product used in tensor decompositions. The outer product of two vectors produces a rank-1 matrix. Each product has different algebraic properties and applications — the Hadamard product's simplicity makes it the most intuitive starting point for students learning matrix operations.
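The shape behavior of these other products is easy to check numerically. A short NumPy sketch of the Kronecker and outer products mentioned above:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])   # 2x2
B = np.array([[0, 1], [1, 0]])   # 2x2

# Kronecker product: each entry of A scales a full copy of B,
# so a (2x2) Kronecker (2x2) gives a (4x4) block matrix.
K = np.kron(A, B)

# Outer product of two vectors gives a rank-1 matrix.
u = np.array([1, 2, 3])
v = np.array([4, 5])
O = np.outer(u, v)               # shape (3, 2)
```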
The Hadamard product (also called the Schur product) is the element-wise multiplication of two matrices of the same size. Each entry in the result equals the product of the corresponding entries: (A ∘ B)ᵢⱼ = aᵢⱼ × bᵢⱼ.
Standard multiplication sums products of rows and columns (AB)ᵢⱼ = Σ aᵢₖbₖⱼ and requires compatible inner dimensions. The Hadamard product multiplies matching positions directly and requires identical dimensions. Standard multiplication is not commutative; the Hadamard product always is.
It is widely used in neural networks (gating in LSTMs, attention mechanisms), signal processing (windowing), image processing (masking), and statistics (element-wise variance computations). In NumPy, the * operator performs the Hadamard product on arrays.
Yes, always. Since (A ∘ B)ᵢⱼ = aᵢⱼ × bᵢⱼ = bᵢⱼ × aᵢⱼ = (B ∘ A)ᵢⱼ, the operation is commutative for all matrices of the same dimensions. This is a key difference from standard matrix multiplication.
The identity element is the matrix J where every entry equals 1. This is because aᵢⱼ × 1 = aᵢⱼ for all entries. Note this differs from standard multiplication, whose identity is the identity matrix I (ones on the diagonal, zeros elsewhere).
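The two identity elements can be verified directly in NumPy:

```python
import numpy as np

A = np.array([[2.0, -3.0], [0.5, 7.0]])

# The all-ones matrix J is the identity for the Hadamard product: A * J = A.
J = np.ones_like(A)

# The identity matrix I is the identity for standard multiplication: A @ I = A.
I = np.eye(2)
```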
No, both matrices must have exactly the same dimensions (same number of rows and same number of columns). There is no Hadamard product defined for matrices of different sizes.
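NumPy enforces this at runtime: multiplying genuinely incompatible shapes raises an error. (One caveat: NumPy's broadcasting rules do allow element-wise multiplication of some differently shaped arrays, such as a matrix by a row vector, but the mathematical Hadamard product is defined only for identical dimensions.)

```python
import numpy as np

A = np.ones((2, 2))
B = np.ones((3, 3))

# Element-wise multiplication of a 2x2 with a 3x3 is undefined;
# NumPy raises a ValueError for the incompatible shapes.
try:
    A * B
    ok = False
except ValueError:
    ok = True
```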