What Is Quantum Error Correction (QEC)?
This is part of the Quantum Security Reference Deep Dive series. For the full landscape overview, see the capstone article on quantum security.
Introduction
Quantum error correction (QEC) is a set of techniques for protecting quantum computations from the errors that quantum hardware inevitably introduces. Qubits are extraordinarily fragile: thermal noise, electromagnetic interference, material imperfections, and even cosmic rays can corrupt their quantum states. Without error correction, errors accumulate so quickly that any long computation collapses into noise before producing a useful result. QEC is the reason a cryptographically relevant quantum computer (CRQC) is possible in principle, and the primary engineering barrier to building one in practice.
Why Quantum Computers Need Error Correction
Classical computers also experience bit errors, but classical error correction is comparatively simple. A classical bit is either 0 or 1, and you can copy it to check for corruption. Quantum mechanics forbids copying a qubit’s state (the no-cloning theorem), and measuring a qubit destroys the superposition that makes quantum computing powerful. QEC must detect and correct errors without directly observing the quantum information being protected.
The solution, first proposed in the mid-1990s, is to encode a single logical qubit across many physical qubits in a way that allows errors to be detected indirectly. Ancilla (helper) qubits are measured to extract error information, called syndromes, without disturbing the encoded data. A classical decoder then interprets the syndromes and determines which corrections to apply. This process, called syndrome extraction, must run continuously throughout the computation, catching and correcting errors faster than they accumulate.
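The idea of extracting error information without reading the data can be seen in the simplest possible toy: the 3-qubit bit-flip repetition code, simulated classically. This is an illustrative sketch only; a real quantum code must also handle phase errors, which this toy cannot, and the function names here are my own.

```python
import random

# Classical simulation of the 3-qubit bit-flip repetition code.
# The "syndromes" are the two parity checks (bit0 XOR bit1, bit1 XOR bit2)
# that ancilla qubits would measure -- they locate an error without ever
# revealing the encoded logical bit itself.

SYNDROME_TO_CORRECTION = {
    (0, 0): None,  # no error detected
    (1, 0): 0,     # first bit flipped
    (1, 1): 1,     # middle bit flipped
    (0, 1): 2,     # last bit flipped
}

def encode(bit):
    """Encode one logical bit into three physical bits."""
    return [bit, bit, bit]

def extract_syndrome(block):
    """Parity checks locate an error without reading the data."""
    return (block[0] ^ block[1], block[1] ^ block[2])

def correct(block):
    """The classical 'decoder': map syndrome to a correction and apply it."""
    flip = SYNDROME_TO_CORRECTION[extract_syndrome(block)]
    if flip is not None:
        block[flip] ^= 1
    return block

# Encode, inject one random bit-flip, then detect and correct it.
block = encode(1)
block[random.randrange(3)] ^= 1           # a single physical-bit error
assert extract_syndrome(block) != (0, 0)  # the error shows up in the syndrome
assert correct(block) == [1, 1, 1]        # logical bit recovered
```

Note that both syndromes equal zero only when no (single) error has occurred; the syndrome pattern identifies which bit to flip back, never the value of the logical bit.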
The Threshold Concept
QEC works only if the physical error rate of the hardware falls below a critical value called the error correction threshold. Above this threshold, adding more physical qubits to encode a logical qubit introduces more errors than it corrects. Below it, adding qubits makes the logical qubit exponentially more reliable.
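The threshold behavior can be made concrete with the standard heuristic scaling p_logical ~ (p_physical / p_threshold)^((d+1)/2) for a distance-d code. The 1% threshold and the prefactor of 1 used below are simplifying assumptions for illustration, not measured values for any particular device.

```python
# Illustrative threshold scaling: below threshold, raising the code distance d
# suppresses the logical error rate exponentially; above threshold, the same
# extra qubits make things worse.

def logical_error_rate(p_physical, distance, p_threshold=0.01):
    """Heuristic p_logical ~ (p/p_th)^((d+1)/2); prefactor assumed to be 1."""
    return (p_physical / p_threshold) ** ((distance + 1) // 2)

for d in (3, 5, 7):
    below = logical_error_rate(0.001, d)  # hardware 10x better than threshold
    above = logical_error_rate(0.02, d)   # hardware 2x worse than threshold
    print(f"d={d}: below-threshold {below:.0e}, above-threshold {above:.0f}")
```

With physical errors ten times below threshold, each distance step divides the logical error rate by ten; with physical errors above threshold, each step doubles it. That sign flip is why the threshold is the number to watch.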
This threshold is the most important number in quantum computing for anyone tracking the path to a CRQC. My CRQC Capability Framework identifies below-threshold operation as one of the ten capabilities a CRQC requires, and tracks experimental progress toward it across quantum computing platforms.
Recent experiments have demonstrated below-threshold operation on small systems. Google’s Willow processor showed that increasing the surface code distance from 3 to 5 to 7 reduced logical error rates exponentially, the signature behavior of a system operating below threshold. Several other platforms have demonstrated similar scaling. These are small-scale demonstrations, not cryptographic-scale systems, but they confirm that the underlying physics works. I track these milestones in my coverage of experimental QEC below threshold.
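Below-threshold behavior is usually summarized by an error-suppression factor Λ: each increase of the code distance by 2 divides the logical error rate by Λ. Google reported Λ of roughly 2.1 for Willow; the starting error rate in this sketch is an illustrative assumption, not a measured figure.

```python
# Sketch of the exponential-suppression signature seen in below-threshold
# experiments: logical error rate falls by a factor LAMBDA per distance step.
# LAMBDA = 2.1 is roughly the Willow figure; the 3e-3 starting rate at d=3
# is an illustrative assumption.

LAMBDA = 2.1

def project_logical_error(p_at_d3, distance):
    """Project the logical error rate at odd distance >= 3."""
    steps = (distance - 3) // 2          # distance grows in steps of 2
    return p_at_d3 / LAMBDA ** steps

for d in (3, 5, 7, 9, 11):
    print(f"distance {d}: projected logical error {project_logical_error(3e-3, d):.1e}")
```

The caveat is that Λ is measured on small codes; whether it holds at the distances a cryptographic-scale machine needs is exactly what future experiments must show.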
Surface Codes and Beyond
The surface code is the most widely studied QEC scheme and the basis for most current CRQC resource estimates. It arranges physical qubits in a two-dimensional grid where each logical qubit is encoded across a patch of physical qubits. The surface code is attractive because it tolerates relatively high error rates (thresholds around 1%) and requires only nearest-neighbor qubit connectivity, which matches the hardware topology of superconducting processors.
The cost is overhead. Under current surface code estimates, encoding one logical qubit requires roughly 1,000 physical qubits. Breaking RSA-2048 with Gidney’s 2025 estimate of 1,399 logical qubits would therefore require on the order of one million physical qubits. This overhead is why resource estimates for cryptographic attacks are so large.
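The arithmetic behind these headline numbers is straightforward. A common surface-code layout uses d² data qubits plus d² − 1 ancilla qubits per logical qubit; the distance of 23 chosen below is an illustrative value that lands near the rough 1,000-physical-per-logical figure, not a number from Gidney's paper.

```python
# Back-of-envelope surface-code overhead. Assumes the common layout of
# d^2 data qubits + (d^2 - 1) ancilla qubits per logical qubit.
# d = 23 is an illustrative choice; real resource estimates pick the
# distance from target error rates and algorithm runtime.

def physical_per_logical(distance):
    """Physical qubits (data + ancilla) for one surface-code logical qubit."""
    return 2 * distance**2 - 1

logical_qubits = 1399        # Gidney's 2025 RSA-2048 estimate (from the text)
d = 23                       # illustrative code distance
per_logical = physical_per_logical(d)
total = logical_qubits * per_logical

print(f"{per_logical} physical qubits per logical qubit at d={d}")
print(f"~{total / 1e6:.1f} million physical qubits for {logical_qubits} logical qubits")
```

At d = 23 this gives 1,057 physical qubits per logical qubit and roughly 1.5 million in total, the same order of magnitude as the estimates quoted above.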
Newer approaches are reducing this overhead. Quantum low-density parity-check (qLDPC) codes encode logical qubits more efficiently, requiring fewer physical qubits per logical qubit at the cost of more complex connectivity. The Pinnacle architecture exploits qLDPC codes to push the RSA-2048 estimate below 100,000 physical qubits. These results depend on assumptions about hardware connectivity that no current processor meets, but they illustrate how quickly the calculus can change when error correction improves.
The Decoder Bottleneck
Error correction generates a continuous stream of syndrome data that must be processed in real time by a classical decoder. If the decoder cannot keep pace with the quantum processor, a backlog forms and uncorrected errors accumulate. This decoder bottleneck is one of the less publicized challenges on the path to a CRQC, and one I consider underappreciated.
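A rough calculation shows why the bottleneck is real. Every syndrome-extraction round produces one bit per ancilla qubit, and rounds repeat every microsecond or so on superconducting hardware. All numbers below are illustrative assumptions, not specifications of any machine.

```python
# Rough feel for the decoder bottleneck: how much syndrome data a
# cryptographic-scale machine would emit. All inputs are illustrative
# assumptions (1 microsecond cycle, distance-23 surface code, 1,399
# logical qubits).

cycle_time_s = 1e-6                      # one syndrome round per microsecond
distance = 23
ancillas_per_logical = distance**2 - 1   # syndrome bits per logical qubit per round
logical_qubits = 1399

bits_per_second = ancillas_per_logical * logical_qubits / cycle_time_s
print(f"~{bits_per_second / 1e9:.0f} Gbit/s of syndrome data to decode in real time")
```

Under these assumptions the decoder must absorb hundreds of gigabits of syndrome data per second, continuously, for the hours or days a cryptographic computation runs, while producing corrections with latency shorter than the error-accumulation time.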
My CRQC Capability Framework tracks decoder performance as a separate capability dimension precisely because building faster and more accurate decoders is a distinct engineering problem from building better qubits or better codes.
What This Means for Security Leaders
QEC is the capability that determines the timeline. Qubit counts grab headlines, but a million noisy qubits with poor error correction cannot run Shor’s algorithm. A hundred thousand qubits with excellent error correction might. When evaluating quantum computing announcements, the error correction results are the ones that matter most for assessing whether the quantum threat is advancing.
Every QEC milestone that demonstrates better logical error rates, higher code distances, or more efficient encoding schemes compresses the timeline to a CRQC. The appropriate response is not alarm but preparation: migrate to PQC on the timelines that regulators have already set.
Go Deeper
What Is a Logical Qubit? — the output of error correction
Capability B.1: Quantum Error Correction (QEC) — CRQC Framework deep dive
Quantum Errors and QEC Methods — full technical overview
Surface Code Quantum Error Correction — the dominant QEC approach
Experimental QEC Below Threshold — tracking milestone results
qLDPC Codes — the next generation of error correction
The Decoder Bottleneck — the challenge nobody talks about
Quantum Upside & Quantum Risk - Handled
My company, Applied Quantum, helps governments, enterprises, and investors prepare for both the upside and the risk of quantum technologies. We deliver concise board and investor briefings; demystify quantum computing, sensing, and communications; craft national and corporate strategies to capture advantage; and turn plans into delivery. We help you mitigate the quantum risk by executing crypto‑inventory, crypto‑agility implementation, PQC migration, and broader defenses against the quantum threat. We run vendor due diligence, proof‑of‑value pilots, standards and policy alignment, workforce training, and procurement support, then oversee implementation across your organization. Contact me if you want help.