What Is a Logical Qubit?
This is part of the Quantum Security Reference Deep Dive series. For the full landscape overview, see the capstone article on quantum security.
Introduction
A logical qubit is an error-corrected unit of quantum information, constructed from many physical qubits working together to protect the encoded data against hardware errors. Physical qubits are the raw hardware: superconducting circuits, trapped ions, neutral atoms, or photonic elements. Logical qubits are what you build from them once you apply quantum error correction. The distinction matters because every meaningful quantum computation, including running Shor’s algorithm to break cryptography, is measured in logical qubits, not physical ones.
Why the Distinction Matters
When a company announces a 1,000-qubit processor, those are physical qubits. Physical qubits are noisy. Each gate operation, each measurement, each moment a qubit sits idle introduces a small probability of error. Over a computation requiring millions of operations, these errors compound and destroy the result.
A logical qubit absorbs this noise by distributing quantum information across a group of physical qubits in a pattern that allows errors to be detected and corrected continuously. The encoding is designed so that no single physical qubit failure corrupts the logical information. Think of it as a RAID array for quantum data: individual disks can fail without losing the stored information, because redundancy is built into the encoding.
The ratio between physical and logical qubits is the overhead, and it is steep. Under current surface code implementations, encoding one logical qubit requires on the order of 1,000 physical qubits, though the exact number depends on the target error rate and the code distance chosen. A computation needing 1,400 logical qubits (the Gidney 2025 estimate for breaking RSA-2048) therefore requires roughly one million physical qubits, plus additional qubits for ancilla measurements and magic state factories.
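The arithmetic behind that estimate is simple enough to sketch. The 1,000:1 overhead and the 1,400-logical-qubit figure come from the discussion above; treating the overhead as a flat ratio is an illustrative simplification, not a hardware specification:

```python
# Back-of-envelope overhead arithmetic using the figures quoted above.
# These are order-of-magnitude illustrations, not a hardware spec.

PHYSICAL_PER_LOGICAL = 1_000   # rough surface-code overhead at current error rates
LOGICAL_FOR_RSA_2048 = 1_400   # Gidney 2025 logical-qubit estimate

data_qubits = LOGICAL_FOR_RSA_2048 * PHYSICAL_PER_LOGICAL
print(f"{data_qubits:,} physical qubits for the encoded data alone")  # prints 1,400,000
# Ancilla measurements and magic state factories add further qubits on top.
```

Even this crude multiplication makes the headline gap obvious: a 1,000-physical-qubit machine is three orders of magnitude short of the data qubits alone.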
This is why a headline announcing “1,000 qubits” does not mean a machine capable of 1,000 logical qubits of computation. It means a machine that might, under ideal conditions, support one or two logical qubits at modest code distances.
How the Overhead Is Shrinking
The 1,000-to-1 ratio is not fixed. It reflects the current state of surface code error correction with current hardware error rates. Both sides of the equation are improving.
Hardware error rates are dropping. As physical qubits become more reliable, each logical qubit needs fewer physical qubits for the same level of protection. The error correction threshold sets the boundary: as long as physical error rates stay below the threshold, better hardware translates directly into lower overhead.
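One way to see why sub-threshold error rates pay off so quickly: the surface code's logical error rate is commonly approximated as p_L ≈ A·(p/p_th)^((d+1)/2), where d is the code distance. The sketch below uses that standard scaling model; the constants (A = 0.1, p_th = 1%) and the sample physical error rates are illustrative assumptions, not measured values. Since the qubit count per logical qubit grows roughly as d², a smaller required distance translates directly into lower overhead:

```python
# Illustrative surface-code scaling model (assumed constants, not hardware data):
#   p_logical ≈ A * (p_physical / p_threshold) ** ((d + 1) / 2)

def logical_error_rate(p_physical, d, p_threshold=1e-2, A=0.1):
    """Approximate logical error rate at code distance d."""
    return A * (p_physical / p_threshold) ** ((d + 1) // 2)

def distance_for_target(p_physical, target, p_threshold=1e-2, A=0.1):
    """Smallest odd code distance whose modeled logical error rate meets the target."""
    if p_physical >= p_threshold:
        raise ValueError("above threshold: no code distance suffices")
    d = 3
    while logical_error_rate(p_physical, d, p_threshold, A) > target:
        d += 2  # surface code uses odd distances
    return d

# As hardware improves (p drops), the distance needed for a 1e-12 target shrinks,
# and with it the ~d**2 physical qubits per logical qubit.
for p in (2e-3, 1e-3, 5e-4):
    d = distance_for_target(p, 1e-12)
    print(f"p={p:.0e}: distance {d}, roughly {d * d} physical qubits per logical qubit")
```

The same model also shows the cliff on the wrong side of the threshold: once p exceeds p_th, adding distance makes the logical error rate worse, not better, which is why sub-threshold operation is the non-negotiable entry ticket.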
New error correction codes are more efficient. Quantum low-density parity-check (qLDPC) codes can encode logical qubits with significantly fewer physical qubits than surface codes, at the cost of requiring more complex qubit connectivity. The Pinnacle architecture uses qLDPC codes to estimate that RSA-2048 could be broken with fewer than 100,000 physical qubits. If validated, that would represent a tenfold reduction in the hardware required.
Hardware-native error suppression is another avenue. Some qubit types, like the cat qubits developed by Alice & Bob, are engineered to suppress one type of error at the physical level, reducing the burden on error correction codes and lowering the logical-to-physical overhead.
Reading the Headlines Critically
With this distinction in hand, quantum computing announcements become easier to evaluate.
When a company reports a qubit count, ask whether those are physical or logical qubits. If physical (as they almost always are), ask what logical qubit count the system can support and at what error rate. A 1,000-physical-qubit machine operating above the error correction threshold supports zero useful logical qubits. A 100-physical-qubit machine operating well below threshold might support a handful of high-quality logical qubits, which is more meaningful for the path to a cryptographically relevant quantum computer (CRQC).
When a paper reports a logical qubit demonstration, ask at what code distance and what logical error rate. A logical qubit at code distance 3 with a logical error rate of 1% is a proof of concept. A logical qubit at code distance 7 with a logical error rate of 10⁻⁸ is a step toward fault-tolerant computing.
My CRQC Quantum Capability Framework is designed to evaluate exactly these claims, placing each result in the context of the ten capabilities a CRQC requires.
Go Deeper
CRQC Quantum Capability Framework — evaluating quantum progress systematically
The Rise of Logical Qubits — full deep dive on logical qubit progress
What Is Quantum Error Correction? — the technique that creates logical qubits
Surface Code QEC — the dominant error correction approach
qLDPC Codes — reducing the overhead
Quantum Breakthrough for RSA-2048 (Gidney 2025) — what the logical qubit requirements actually are
Quantum Upside & Quantum Risk - Handled
My company, Applied Quantum, helps governments, enterprises, and investors prepare for both the upside and the risk of quantum technologies. We deliver concise board and investor briefings; demystify quantum computing, sensing, and communications; craft national and corporate strategies to capture advantage; and turn plans into delivery. We help you mitigate the quantum risk by executing crypto‑inventory, crypto‑agility implementation, PQC migration, and broader defenses against the quantum threat. We run vendor due diligence, proof‑of‑value pilots, standards and policy alignment, workforce training, and procurement support, then oversee implementation across your organization. Contact me if you want help.