Apply Yates continuity correction for binomial-to-normal and Poisson-to-normal approximations with exact comparison, z-scores, and CDF tables.
The continuity correction (also known as Yates' continuity correction) bridges the gap between discrete probability distributions and their continuous normal approximation. When you approximate a binomial or Poisson distribution with a normal curve, you lose the inherent "width" of each discrete probability bar — each integer value x occupies the interval [x−0.5, x+0.5] on the number line.
This calculator applies the appropriate ±0.5 correction for five probability types: P(X ≤ x), P(X ≥ x), P(X < x), P(X > x), and P(X = x). It computes z-scores both with and without the correction, compares the normal approximation against exact binomial probabilities, and shows a comparison table for nearby values so you can see exactly how much the correction improves accuracy.
The calculator supports binomial → normal, Poisson → normal, and a general discrete CDF mode. Presets demonstrate common scenarios including coin flipping, surveys, rare events, and election polling. By comparing corrected vs. uncorrected results side by side with exact probabilities, you develop strong intuition for when the correction matters most — typically when n is small or p is far from 0.5.
Textbooks teach the continuity correction, but applying it correctly requires knowing which direction to adjust (+0.5 or −0.5) for each type of probability — and this is where errors commonly occur. This calculator eliminates guesswork by automatically applying the correct rule and showing you the exact correction used.
The side-by-side comparison of corrected, uncorrected, and exact probabilities builds lasting intuition. Students can explore how the correction's impact changes with n, p, and x, while practitioners can verify their manual calculations against exact results.
Continuity Correction Rules:

P(X ≤ x) → P(Z ≤ (x + 0.5 − μ)/σ)
P(X ≥ x) → P(Z ≥ (x − 0.5 − μ)/σ)
P(X < x) → P(Z < (x − 0.5 − μ)/σ)
P(X > x) → P(Z > (x + 0.5 − μ)/σ)
P(X = x) → P((x − 0.5 − μ)/σ < Z < (x + 0.5 − μ)/σ)

For Binomial: μ = np, σ = √(np(1−p))
For Poisson: μ = λ, σ = √λ
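The five rules can be sketched as a small dispatch function. This is a minimal illustration, not the calculator's implementation; the `kind` labels and the name `corrected_normal_prob` are invented for this sketch.

```python
from math import erf, sqrt

def phi(z):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def corrected_normal_prob(kind, x, mu, sigma):
    """Normal approximation with the +/-0.5 continuity correction.
    kind: 'le' for P(X<=x), 'ge' for P(X>=x), 'lt' for P(X<x),
    'gt' for P(X>x), 'eq' for P(X=x) (labels are illustrative)."""
    if kind == "le":   # P(X <= x): area up to x + 0.5
        return phi((x + 0.5 - mu) / sigma)
    if kind == "ge":   # P(X >= x): area beyond x - 0.5
        return 1.0 - phi((x - 0.5 - mu) / sigma)
    if kind == "lt":   # P(X < x): area up to x - 0.5
        return phi((x - 0.5 - mu) / sigma)
    if kind == "gt":   # P(X > x): area beyond x + 0.5
        return 1.0 - phi((x + 0.5 - mu) / sigma)
    if kind == "eq":   # P(X = x): area of the single bar
        return phi((x + 0.5 - mu) / sigma) - phi((x - 0.5 - mu) / sigma)
    raise ValueError(kind)

# Binomial parameters: mu = np, sigma = sqrt(np(1-p))
n, p, x = 100, 0.5, 55
mu, sigma = n * p, sqrt(n * p * (1 - p))
print(round(corrected_normal_prob("le", x, mu, sigma), 4))  # ~0.8643
```

Note how P(X ≤ x) and P(X > x) use the same 0.5 boundary, so the corrected probabilities still sum to exactly 1.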
Result:
Without CC: Z = 1.000, P = 0.8413
With CC: Z = 1.100, P = 0.8643
Exact: 0.8644
For 100 coin flips, P(X ≤ 55) with the continuity correction gives 0.8643, almost exactly matching the true binomial probability of 0.8644. Without correction, the approximation is 0.8413 — off by 0.023.
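The coin-flip numbers can be reproduced with a few lines of standard-library Python, computing the exact binomial CDF by summing the probability mass function:

```python
from math import comb, erf, sqrt

def phi(z):
    # Standard normal CDF.
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

n, p, x = 100, 0.5, 55                    # the coin-flip example above
mu, sigma = n * p, sqrt(n * p * (1 - p))  # mu = 50, sigma = 5

# Exact binomial P(X <= 55) by summing the pmf.
exact = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(x + 1))
without_cc = phi((x - mu) / sigma)        # z = 1.000
with_cc = phi((x + 0.5 - mu) / sigma)     # z = 1.100

print(f"exact={exact:.4f} with_cc={with_cc:.4f} without_cc={without_cc:.4f}")
```

Running this reproduces the three probabilities quoted in the result above.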
The normal distribution is a continuous curve with probability spread over every real number. The binomial and Poisson distributions assign probability only to integers. When you draw the binomial probability histogram and overlay the normal curve, each bar covers the interval [k−0.5, k+0.5]. To find P(X ≤ 5), you need the area up to 5.5, not just up to 5 — otherwise you're cutting off half of the bar at x = 5.
This geometric insight is the continuity correction: adjust by ±0.5 to align the continuous approximation with the discrete bars.
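The "half a bar" effect is easy to see numerically. The sketch below uses Binomial(10, 0.5) as an illustrative choice (these parameters are not from the text): integrating only up to 5.0 misses about half of P(X = 5), while integrating to 5.5 recovers it.

```python
from math import comb, erf, sqrt

def phi(z):
    # Standard normal CDF.
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Illustrative parameters: Binomial(n=10, p=0.5)
n, p = 10, 0.5
mu, sigma = n * p, sqrt(n * p * (1 - p))     # mu = 5, sigma ~ 1.581

exact = sum(comb(n, k) * 0.5**n for k in range(6))  # P(X <= 5) = 638/1024
area_to_5_5 = phi((5.5 - mu) / sigma)  # includes the full bar at x = 5
area_to_5_0 = phi((5.0 - mu) / sigma)  # cuts the x = 5 bar in half

print(exact, area_to_5_5, area_to_5_0)
```

The uncorrected area comes out near 0.5, short of the exact 0.623 by roughly half of P(X = 5) = 252/1024 ≈ 0.246, exactly the half-bar the correction restores.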
The normal approximation to the binomial dates to Abraham de Moivre (1733) and Pierre-Simon Laplace. The formal continuity correction was popularized by Frank Yates in 1934 in the context of chi-squared tests. For decades, when computation was expensive, the corrected normal approximation was the standard method for computing binomial probabilities. Today, with exact computation cheap, the correction is primarily a pedagogical tool — but understanding it deepens your grasp of the relationship between discrete and continuous distributions.
For large samples (n > 200) or probabilities near the center of the distribution, the correction changes only the 4th or 5th decimal place. In practice, you can skip it whenever your software computes the exact binomial CDF directly. But always apply it on exams and in textbook exercises unless told otherwise, and always when computing by hand with z-tables.
Apply it whenever you approximate a discrete distribution (binomial, Poisson, hypergeometric) with a continuous normal distribution. It's most important when n is small (< 50), p is far from 0.5, or when computing probabilities at specific values.
Each discrete probability at integer x corresponds to the interval [x−0.5, x+0.5] on the continuous number line. P(X ≤ 3) includes 3, so you use 3.5 as the upper bound. P(X ≥ 3) starts at 3, so you use 2.5 as the lower bound. This captures the full "width" of each bar.
In rare cases with very large n and p near 0.5, the uncorrected approximation may be closer to the exact value. However, the correction never makes things significantly worse and usually improves accuracy substantially.
They share the same principle. Yates' chi-square correction subtracts 0.5 from |O−E| in 2×2 contingency tables. This calculator applies the correction to normal approximations of discrete distributions, which is the more general form.
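Yates' original form can be sketched directly: shrink each |O − E| by 0.5 before squaring in the 2×2 chi-square statistic. The counts in the table below are illustrative, not from the text.

```python
# Yates-corrected chi-square statistic for a 2x2 table (illustrative counts).
table = [[12, 5],
         [8, 15]]

row = [sum(r) for r in table]
col = [sum(c) for c in zip(*table)]
total = sum(row)

chi2 = 0.0
for i in range(2):
    for j in range(2):
        expected = row[i] * col[j] / total
        observed = table[i][j]
        # Yates: shrink each |O - E| by 0.5 before squaring
        # (floored at 0 so the correction cannot overshoot).
        chi2 += max(abs(observed - expected) - 0.5, 0.0) ** 2 / expected

print(round(chi2, 3))  # → 3.683
```

Without the 0.5 adjustment the same table gives a larger statistic, which is the direction of Yates' fix: the uncorrected 2×2 chi-square tends to overstate significance.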
Yes. When approximating Poisson probabilities with a normal distribution (especially for λ > 10), the continuity correction improves accuracy for the same reason — the Poisson is discrete but the normal is continuous.
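A quick Poisson check, with λ = 15 and P(X ≤ 18) chosen purely for illustration: the exact CDF is a finite sum of pmf terms, and the corrected normal approximation uses μ = λ, σ = √λ.

```python
from math import erf, exp, factorial, sqrt

def phi(z):
    # Standard normal CDF.
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

lam, x = 15.0, 18               # illustrative values, not from the text
mu, sigma = lam, sqrt(lam)      # Poisson: mu = lambda, sigma = sqrt(lambda)

# Exact Poisson P(X <= 18) by summing the pmf.
exact = sum(exp(-lam) * lam**k / factorial(k) for k in range(x + 1))
with_cc = phi((x + 0.5 - mu) / sigma)
without_cc = phi((x - mu) / sigma)

print(f"exact={exact:.4f} with_cc={with_cc:.4f} without_cc={without_cc:.4f}")
```

As with the binomial case, the corrected value lands much closer to the exact Poisson probability than the uncorrected one.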
The rule of thumb is np ≥ 5 and n(1−p) ≥ 5 for a reasonable approximation. With the continuity correction, even borderline cases (np ≈ 5) produce good results. For np ≥ 10, the corrected approximation is typically accurate to 3+ decimal places.
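The rule of thumb can be probed empirically by scanning the worst-case CDF error over all x. The two parameter pairs below (np = 5 vs. np = 25) are illustrative choices, not from the text.

```python
from math import comb, erf, sqrt

def phi(z):
    # Standard normal CDF.
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def max_cdf_error(n, p):
    """Largest |exact - corrected-normal| over P(X <= x) for x = 0..n."""
    mu, sigma = n * p, sqrt(n * p * (1 - p))
    exact, worst = 0.0, 0.0
    for x in range(n + 1):
        exact += comb(n, x) * p**x * (1 - p)**(n - x)  # running exact CDF
        approx = phi((x + 0.5 - mu) / sigma)
        worst = max(worst, abs(exact - approx))
    return worst

# Borderline case np = 5 versus a comfortable case np = 25.
print(f"np=5:  {max_cdf_error(50, 0.1):.4f}")
print(f"np=25: {max_cdf_error(50, 0.5):.4f}")
```

The borderline case shows a noticeably larger worst-case error: the correction handles discreteness, but not the skew that appears when p is far from 0.5.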