Compute the dot product of two vectors in 2D–6D, find the angle between them, check orthogonality and parallelism, and visualize component contributions with an interactive calculator.
The dot product (also called the scalar product or inner product) is one of the most fundamental operations in linear algebra. Given two vectors a and b of the same dimension, the dot product a · b equals the sum of the products of their corresponding components: a₁b₁ + a₂b₂ + … + aₙbₙ. The result is a single scalar, not a vector.
Beyond the algebraic definition, the dot product has a powerful geometric interpretation: a · b = ‖a‖ ‖b‖ cos θ, where θ is the angle between the two vectors. This connection means the dot product simultaneously encodes magnitude and directional information. When a · b = 0, the vectors are orthogonal (perpendicular); when cos θ = ±1, they are parallel or anti-parallel.
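The two definitions above can be checked against each other in a few lines of Python. This is a minimal sketch (the helper names `dot` and `magnitude` are our own, not part of the calculator): compute a · b component-wise, recover θ from cos θ = (a · b) / (‖a‖ ‖b‖), and confirm that ‖a‖ ‖b‖ cos θ gives the same scalar back.

```python
import math

def dot(a, b):
    """Algebraic definition: sum of component-wise products, Σ aᵢbᵢ."""
    return sum(x * y for x, y in zip(a, b))

def magnitude(v):
    """Euclidean norm, using the identity v · v = ‖v‖²."""
    return math.sqrt(dot(v, v))

a, b = (3.0, 4.0), (4.0, 3.0)
algebraic = dot(a, b)                      # 3·4 + 4·3 = 24
theta = math.acos(algebraic / (magnitude(a) * magnitude(b)))
geometric = magnitude(a) * magnitude(b) * math.cos(theta)
print(algebraic)                # 24.0
print(round(geometric, 10))     # 24.0 — both definitions agree
```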
This calculator handles vectors from 2D up to 6D. Enter the components of both vectors, choose from six built-in presets, and instantly see the dot product, the angle in degrees and radians, cos θ, individual magnitudes, scalar projection, and orthogonality and parallelism checks. A component products table breaks down how each dimension contributes, and bar charts visualize product magnitudes so you can spot dominant components at a glance.
The dot product appears everywhere: computing work done by a force in physics, measuring cosine similarity in machine learning, projecting one vector onto another in computer graphics, and testing perpendicularity in geometry. Understanding it deeply unlocks much of applied mathematics, data science, and engineering.
Computing the dot product, both magnitudes, the division, and the inverse cosine for the angle is a multi-step process where arithmetic mistakes compound quickly — especially in higher dimensions. This calculator instantly provides the dot product, angle in degrees and radians, orthogonality and parallelism checks, scalar projection, and a component-by-component breakdown with visual bars. It is essential for physics students computing work (W = F·d), ML practitioners evaluating cosine similarity, and anyone needing a quick perpendicularity test.
a · b = Σ aᵢbᵢ = a₁b₁ + a₂b₂ + … + aₙbₙ = ‖a‖ ‖b‖ cos θ
Result: a · b = 32
1×4 + 2×5 + 3×6 = 4 + 10 + 18 = 32. ‖a‖ ≈ 3.742, ‖b‖ ≈ 8.775, cos θ ≈ 0.9746, θ ≈ 12.93°. The vectors are nearly parallel.
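The worked example above can be reproduced step by step. This sketch mirrors the same pipeline the calculator performs: dot product, both magnitudes, cos θ, then the inverse cosine converted to degrees.

```python
import math

a, b = (1, 2, 3), (4, 5, 6)
dot_ab = sum(x * y for x, y in zip(a, b))        # 4 + 10 + 18 = 32
norm_a = math.sqrt(sum(x * x for x in a))        # √14 ≈ 3.742
norm_b = math.sqrt(sum(x * x for x in b))        # √77 ≈ 8.775
cos_theta = dot_ab / (norm_a * norm_b)           # ≈ 0.9746
theta_deg = math.degrees(math.acos(cos_theta))   # ≈ 12.93°
print(dot_ab, round(cos_theta, 4), round(theta_deg, 2))
```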
The dot product has two equivalent definitions. **Algebraically**, a · b = Σ aᵢbᵢ, the sum of component-wise products. **Geometrically**, a · b = ‖a‖ ‖b‖ cos θ, where θ is the angle between the vectors. The algebraic form is how you compute it; the geometric form is what it means. A special case is a · a = ‖a‖², connecting the dot product to the Euclidean norm. Unlike the cross product, the dot product returns a scalar, works in any dimension, and is commutative (a · b = b · a).
When a · b = 0, the vectors are **orthogonal** (perpendicular), making the dot product the fastest perpendicularity test. The **scalar projection** of a onto b is (a · b)/‖b‖, giving the signed length of a's shadow on b. The **vector projection** scales the unit vector of b by this scalar. In machine learning and NLP, **cosine similarity** — the dot product of unit vectors — measures directional alignment regardless of magnitude, used in document similarity, recommendation engines, and attention mechanisms in transformers.
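The projection and cosine-similarity formulas above translate directly into code. A short sketch (function names are illustrative, not an established API): the scalar projection is (a · b)/‖b‖, the vector projection scales b's unit vector by that value, and cosine similarity normalizes both vectors so only direction matters.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(v):
    return math.sqrt(dot(v, v))

def scalar_projection(a, b):
    """Signed length of a's shadow on b: (a · b) / ‖b‖."""
    return dot(a, b) / norm(b)

def vector_projection(a, b):
    """b's unit vector scaled by the scalar projection."""
    s = dot(a, b) / dot(b, b)
    return tuple(s * x for x in b)

def cosine_similarity(a, b):
    """Dot product of the unit vectors: direction only, magnitude ignored."""
    return dot(a, b) / (norm(a) * norm(b))

a, b = (3.0, 4.0), (5.0, 0.0)
print(scalar_projection(a, b))    # 3.0 — shadow of (3, 4) on the x-axis
print(vector_projection(a, b))    # (3.0, 0.0)
print(cosine_similarity(a, b))    # 0.6
```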
In physics, **work** W = F · d is the dot product of force and displacement, extracting only the component of force along the direction of motion. In electrical engineering, the average AC power P = ‖V‖ ‖I‖ cos φ, where φ is the phase angle between the voltage and current phasors, has exactly the form of the geometric dot product. In data science, the dot product underpins linear regression (Xᵀy), principal component analysis, and neural network forward passes (weight · input + bias). The dot product's simplicity and universality make it arguably the single most important operation in applied linear algebra.
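The work formula W = F · d is a one-liner in code. The force and displacement values below are made-up illustrative numbers: only the component of force along the motion contributes, and a component perpendicular to the displacement contributes nothing.

```python
# Work done by a force along a displacement: W = F · d (illustrative values).
force = (10.0, 5.0, 0.0)          # newtons
displacement = (3.0, -2.0, 4.0)   # metres
work = sum(f * d for f, d in zip(force, displacement))
print(work)   # 10·3 + 5·(−2) + 0·4 = 20.0 joules
```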
The dot product of two vectors is the sum of the products of their corresponding components. It returns a scalar (number), not a vector.
a · b = ‖a‖ ‖b‖ cos θ. Dividing both sides by the magnitudes gives cos θ = (a · b) / (‖a‖ ‖b‖), so the dot product encodes the angle.
The dot product is zero when the two vectors are orthogonal (perpendicular), because cos 90° = 0. Checking whether a · b = 0 is therefore the quickest practical test for perpendicularity.
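One practical caveat when coding this test: with floating-point components, a mathematically zero dot product may come out as something tiny like 1e-16, so compare against a small tolerance rather than exact zero. A sketch (the function name and the default tolerance are our own choices):

```python
def is_orthogonal(a, b, tol=1e-9):
    """Treat the vectors as perpendicular when |a · b| is within tol of zero."""
    return abs(sum(x * y for x, y in zip(a, b))) <= tol

print(is_orthogonal((1.0, 0.0), (0.0, 5.0)))   # True
print(is_orthogonal((1.0, 2.0), (2.0, 1.0)))   # False — dot product is 4
```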
The dot product returns a scalar and works in any dimension. The cross product returns a vector and is defined only in 3D (and 7D). They encode different geometric information.
Yes. A negative dot product means the angle between the vectors is greater than 90°, i.e. they point in generally opposite directions.
Cosine similarity — the normalized dot product — measures how similar two vectors are. It is used in text similarity, recommendation engines, and neural network attention mechanisms.