4,099 Qubits: The Myth and Reality of Breaking RSA-2048 with Quantum Computers
It’s a figure that crops up in countless discussions about quantum computing and cybersecurity: 4,099 qubits. That’s the widely cited number of quantum bits one would need to factor a 2048-bit RSA key using Shor’s algorithm – in other words, the notional threshold at which a quantum computer could crack one of today’s most common encryption standards. The claim has an alluring simplicity: if we could just build a quantum machine with a few thousand perfect qubits, decades of RSA-protected secrets would fall in seconds. But where does this “4,099 logical qubits” figure actually come from, and what does it really mean? The story behind it reveals both how far quantum algorithms have come and how much further quantum hardware needs to go.
The Origin of the “4,099 Logical Qubits” Figure
The magic number 4,099 has its roots in theoretical work from the early 2000s, as researchers began translating Peter Shor’s 1994 factoring algorithm into practical quantum circuit designs. In 2003, computer scientist Stéphane Beauregard showed how to implement Shor’s algorithm using remarkably few qubits. His paper, titled “Circuit for Shor’s algorithm using 2n+3 qubits,” gave a recipe for factoring an n-bit integer with only $$2n + 3$$ quantum bits. For an RSA modulus 2048 bits long, Beauregard’s circuit would need $$2·2048 + 3 = 4,099$$ logical qubits. This was a dramatic reduction from earlier, more naive constructions and quickly became a reference point. In theory, just over four thousand well-behaved qubits, running Shor’s algorithm, could find the prime factors of a 2048-bit number – a task that would take classical supercomputers longer than the age of the universe.
Crucially, Beauregard’s 4,099-qubit figure refers to logical qubits in an ideal quantum computer, essentially assuming qubits that are perfectly stable and error-free. In 2003, and even today, such qubits are purely hypothetical. Beauregard’s result was nonetheless significant: it distilled Shor’s algorithm down to its qubit minimum, showing that quantum factoring didn’t require extravagantly more memory than the size of the number being factored. The paper brought Shor’s factoring algorithm closer to implementation in principle. Ever since, “a few thousand qubits” has been the oft-quoted yardstick for the quantum threat to RSA.
Yet Beauregard’s minimalist approach achieved its low qubit count at the cost of a very large circuit depth (the number of sequential operations). His design requires a series of quantum modular multiplications, additions, and quantum Fourier transforms executed mostly in sequence. The gate count scales on the order of n³ (with some logarithmic factors) – meaning billions of quantum operations for $$n = 2048$$. In a noise-free world, one could carry out those operations given enough time. But real quantum hardware is highly error-prone and qubits can’t maintain quantum states indefinitely. As a result, researchers soon started asking: could we trade more qubits for a shorter runtime to make Shor’s algorithm easier to execute in practice?
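To make these orders of magnitude concrete, here is a minimal back-of-the-envelope sketch (in Python; not code from Beauregard’s paper) that evaluates the $$2n+3$$ qubit formula and a bare $$n^3$$ gate-count scaling for n = 2048. The constant factor on the gate count is an illustrative assumption – the real circuit carries logarithmic factors and constants that the sketch ignores.

```python
# Rough resource figures for Beauregard's 2n+3 construction (illustrative only).

def beauregard_logical_qubits(n: int) -> int:
    """Logical qubits for Beauregard's circuit on an n-bit modulus: 2n + 3."""
    return 2 * n + 3

def rough_gate_count(n: int, constant: float = 1.0) -> float:
    """Very rough gate count, assuming plain n^3 scaling.

    Logarithmic factors are ignored and 'constant' is a placeholder assumption."""
    return constant * n ** 3

n = 2048
print(beauregard_logical_qubits(n))   # 4099 logical qubits
print(f"{rough_gate_count(n):.1e}")   # ~8.6e+09, i.e. billions of operations
```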
Trading Qubits for Speed: Space–Time Optimizations
In the years after 2003, a series of research efforts explored different balances between the number of qubits and the time needed to run Shor’s algorithm. The motivation was simple: error grows with circuit depth, because qubits gradually lose their delicate quantum states (a phenomenon known as decoherence) and each gate operation has some probability of failure – so the longer a circuit takes to execute, the more noise accumulates in the system. If using a few extra qubits can cut the circuit depth significantly, the algorithm might finish before noise ruins the computation. The goal was to find a “sweet spot” where the quantum resources (qubits and time) are both within plausible future technology limits.
One early example came in 2014, when Pavlidis and Gizopoulos proposed a design requiring $$9n + 2$$ qubits (far more than Beauregard’s $$2n+3$$) in order to reduce the circuit depth. For RSA-2048, their approach would use about 18,434 logical qubits instead of 4,099, but the added workspace allowed certain operations to run in parallel, shortening the overall runtime. This exemplified the trade-off: they sacrificed qubit economy to mitigate decoherence and errors. Around the same time, other researchers like Häner et al. (2017) showed variations of Shor’s circuit using roughly $$2n+2$$ qubits (again on the order of 4,000 for 2048-bit keys) with optimized quantum adders and multipliers. The qubit count stayed near the thousands, but incremental improvements in gate efficiency continued to chip away at the huge constant factors in the runtime.
A major breakthrough in space–time optimization came in 2019–2021 with a paper by Craig Gidney and Martin Ekerå. Tapping into a toolbox of techniques – phase estimation tweaks, modular multiplication tricks, and more – Gidney and Ekerå engineered a circuit for Shor’s algorithm that massively reduced the total operations without letting qubit requirements explode beyond reach. Their construction uses on the order of 3n logical qubits (for an n-bit RSA key). In concrete terms, for $$n = 2048$$, their design calls for roughly 6,150 logical qubits ($$3×2048$$ with a small additive term). This is about 50% more qubits than Beauregard’s bare-bones approach, but the payoff is a drastically shorter circuit. They report needing on the order of $$0.3\,n^3$$ quantum gates (approximately $$2.6\times10^9$$ operations for n=2048), which is almost two orders of magnitude fewer operations than some prior estimates in the literature. In terms of overall “space-time volume” (the product of the number of qubits and the number of time-steps), their approach is 100× more efficient than earlier schemes.
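The leading-order qubit formulas quoted above are easy to compare side by side. The sketch below keeps only the leading terms (small additive corrections, such as Gidney and Ekerå’s extra helper qubits, are dropped – an approximation of this sketch, not a figure from the papers) and reproduces the rough counts mentioned in the text for n = 2048, along with the quoted $$0.3\,n^3$$ gate figure.

```python
# Leading-term logical-qubit counts quoted in the text, evaluated at n = 2048.
# Small additive corrections are deliberately dropped in this sketch.
n = 2048
constructions = {
    "Beauregard 2003 (2n+3)":          2 * n + 3,   # 4,099
    "Haner et al. 2017 (2n+2)":        2 * n + 2,   # 4,098
    "Pavlidis-Gizopoulos 2014 (9n+2)": 9 * n + 2,   # 18,434
    "Gidney-Ekera 2019/21 (~3n)":      3 * n,       # ~6,150 once the additive term is included
}
for name, qubits in constructions.items():
    print(f"{name:34s} {qubits:>6,d} logical qubits")

# Gidney-Ekera's quoted gate count of roughly 0.3 * n^3:
print(f"~{0.3 * n**3:.1e} logical gates")   # ~2.6e+09 for n = 2048
```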
To appreciate the improvement: not long ago, in 2012, Fowler et al. estimated that breaking RSA-2048 might require on the order of 1 billion noisy physical qubits with their approach. Gidney and Ekerå’s 2021 result brought that down to around 20 million physical qubits (we’ll explain this “physical” detail shortly) and only 8 hours of runtime. In other words, if one could magically build the quantum computer they outline, it could factor a 2048-bit RSA number in essentially a workday. The catch, of course, is that such a machine would be enormous by today’s standards – but it’s a lot less enormous than previous estimates. Gidney and Ekerå explicitly consider practical constraints like a surface-code error-corrected architecture, communication delays, and the need for repeated quantum error-correction cycles. Their headline scenario (“How to factor 2048-bit RSA in 8 hours using 20 million noisy qubits”) assumes about 20,000 logical qubits (encoded in those 20 million physical qubits) operating in parallel to pull off the computation within hours. If one were willing to run the computation longer – say over a few days – the number of physical qubits needed could be dialed down somewhat below the 20-million baseline. Likewise, if one could somehow build an even larger quantum computer (hundreds of millions of qubits), the RSA break could be sped up to minutes. This flexibility reflects a general principle: Shor’s algorithm’s runtime is roughly inversely proportional to the number of qubits available, so long as the architecture can exploit the extra parallelism. Fewer qubits means a longer computation; more qubits can dramatically accelerate it – up to a point.
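One way to picture that flexibility is a deliberately crude constant-volume model: hold the qubit-hours of the 20-million-qubit, 8-hour scenario fixed and see how the runtime scales as the machine shrinks or grows. The constant-volume assumption is an illustration for this sketch, not a calculation from Gidney and Ekerå’s paper – real architectures cannot be rescaled this freely.

```python
# Crude constant space-time-volume model (an assumption for illustration):
# runtime ~ volume / qubit_count, anchored to the 8-hour, 20-million-qubit scenario.
BASELINE_QUBITS = 20_000_000               # physical qubits in the headline scenario
BASELINE_HOURS = 8.0
VOLUME = BASELINE_QUBITS * BASELINE_HOURS  # qubit-hours, held fixed

for qubits in (5_000_000, 20_000_000, 200_000_000):
    hours = VOLUME / qubits
    print(f"{qubits:>11,d} physical qubits -> ~{hours:4.1f} hours")
# Fewer qubits stretch the run toward days; hundreds of millions push it toward
# the "minutes" regime mentioned above (the simple model breaks down at the extremes).
```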
Importantly, none of these advances changed the ballpark qubit requirement by orders of magnitude. Whether it’s 4,099 qubits, 6,150 qubits, or even 20,000 qubits, we are still talking about quantum resources in the thousands of logical qubits for factoring RSA-2048. The optimizations were about making those thousands of qubits do their work faster and more reliably, not eliminating the need for them. As Gidney and Ekerå summarized, their construction (like others) still requires on the order of n or a few n’s worth of qubits. In fact, for many years researchers suspected that a true asymptotic improvement (significantly below linear in n) might be impossible without some fundamentally new algorithmic insight. The most qubit-frugal design on record for years was a 2006 sketch by Christof Zalka, who outlined how one might reduce the qubit count to around 1.5n, though a full explicit circuit wasn’t given. That would be roughly 3,072 qubits for RSA-2048 – a bit lower, but still “thousands” in the same sense. For a long time, no concrete design came close to that figure. Then, in 2024, a new proposal broke the mold by aiming to dramatically cut qubit usage – at the cost of other resources exploding.
Compressing the Circuit: Fewer Qubits at a (High) Price
In early 2024, Clémence Chevignard and colleagues published a paper titled “Reducing the Number of Qubits in Quantum Factoring.” Their focus was exactly as it sounds: minimize the qubit count needed for Shor’s algorithm, even if it means an impractically long runtime. In a sense, this is the opposite priority of efforts like Pavlidis’s or Gidney’s – it’s a throwback to squeezing qubits above all else, now pushed to an extreme. Chevignard et al. succeeded in proving something striking: it’s possible to factor an n-bit RSA modulus with roughly $$n/2$$ logical qubits (plus some smaller overhead). For a 2048-bit RSA number, their design needs only about 1,730 logical qubits – less than half of Beauregard’s 4,099 and well below even Zalka’s 1.5n sketch. This result is important as a proof of concept that Shor’s algorithm can be made far more qubit-frugal than previously thought. However, before we declare RSA-2048 imminently breakable with a thousand-qubit machine, we must examine the catch: Chevignard’s approach achieves this qubit compression by offloading work elsewhere, resulting in massive costs in time and repetitions.
The Chevignard et al. algorithm uses a series of clever techniques (from number representation tricks to leveraging an earlier Shor variant by Ekerå–Håstad) to perform the modular exponentiation – the heart of Shor’s algorithm – in an almost serial fashion. In simple terms, they reuse a small set of “worker” qubits over and over for pieces of the computation, instead of doing many operations in parallel on a big array of qubits. This reuse means the total number of qubits can be very low at any given moment. But it also means the algorithm’s circuit depth skyrockets. The authors report a gate count on the order of $$n^3$$ (which for n=2048 is on the order of $$2^{36}$$, tens of billions of operations) for just one run of the algorithm. What’s more, their method doesn’t always succeed in one shot – it has a certain probability of finding the correct answer, and if it fails, you have to run it again. On average about 40 runs are required to get the factors. All told, the expected number of logical operations is in the ballpark of $$40 \times 2^{36}$$ gates – a few trillion in total. Even if each logical operation took only, say, 1 microsecond (which is optimistic given error-corrected quantum gate speeds), the total time would be on the order of $$2^{36}$$ microseconds per run times 40 runs – roughly $$3×10^{12}$$ μs, or about a month of continuous runtime. In practice it could be much longer, and errors would accumulate without relentless error correction. In short, the Chevignard scheme shows that yes, you can drop below 2,000 qubits to factor RSA-2048 – but “you pay elsewhere,” in this case by waiting vastly longer and demanding even more from your qubits’ stability.
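The runtime arithmetic above is easy to reproduce. The sketch below uses the figures quoted in this section – roughly $$2^{36}$$ logical gates per run and about 40 expected runs – together with an assumed 1 microsecond per logical operation (a placeholder assumption, not a number from the paper).

```python
# Serial runtime estimate for the Chevignard et al. approach, using the figures
# quoted in the text plus an assumed logical gate time (placeholder value).
GATES_PER_RUN = 2 ** 36   # ~6.9e10 logical operations per run (order of magnitude)
EXPECTED_RUNS = 40        # expected repetitions until the factors are found
US_PER_GATE = 1.0         # assumed microseconds per logical gate (optimistic)

total_gates = GATES_PER_RUN * EXPECTED_RUNS
total_days = total_gates * US_PER_GATE / 1e6 / 86_400
print(f"~{total_gates:.1e} logical gates in total")   # ~2.7e+12
print(f"~{total_days:.0f} days of serial runtime")    # ~32 days
```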
The takeaway from this compressed approach is not that we’ve found a cheap way to break RSA, but rather a confirmation of the no-free-lunch principle in quantum algorithms. Every approach so far that reduces the logical qubit count below a few thousand does so by increasing the time or repetitions required to an impractical extent. Conversely, approaches that reduce the time (like Gidney & Ekerå’s) do so by using more qubits. As one expert succinctly summarized, before Chevignard et al. the best one could do was about 1.5n qubits, and now they’ve achieved ~0.5n qubits – but “concretely, they estimate 1730 logical qubits to factor a 2048-bit semiprime,” with large overhead, whereas earlier methods needed ~4099 qubits with far fewer repetitions. In all cases, the order of magnitude remains in the thousands of logical qubits. The 4,000-qubit ballpark has stood firm; you can push below it only by making other parts of the problem (like the number of steps) correspondingly harder.
From Logical Qubits to Physical Reality
Thus far we’ve been talking about logical qubits – perfect, error-corrected quantum bits that behave exactly as the math of Shor’s algorithm assumes. The distinction between logical and physical qubits is crucial for understanding how far away we are from actually factoring a 2048-bit RSA key. A physical qubit is a qubit in real hardware (for example, a superconducting circuit or a trapped-ion state). Physical qubits suffer errors from environmental noise, imperfect control, and other decoherence effects. A logical qubit, by contrast, is a stable qubit encoded across many physical qubits using quantum error-correcting codes. The leading approach to quantum error correction is the surface code, which assembles a single logical qubit out of a 2D patch of physical qubits arranged in a lattice – dozens of them at modest code sizes, and on the order of a thousand at the reliability levels cryptanalysis would demand. The surface code can detect and correct a certain number of errors as they occur, up to a limit determined by the code’s distance (roughly speaking, the redundancy level). The higher the required reliability, the larger the code distance and thus the more physical qubits needed per logical qubit.
In practical terms, the hardware overhead is enormous. Many resource estimates assume a physical gate error rate around $$10^{-3}$$ (which is optimistic but within reach of today’s best qubit devices) and a target logical success probability high enough to run billions of operations. Under those conditions, a single logical qubit might require on the order of 1,000 or more physical qubits in a surface code array. Gidney & Ekerå, for instance, modeled their 8-hour RSA-breaker with a surface-code architecture and found it would need on the order of 20 million physical qubits in total – about 20,000 logical qubits times roughly 1,000 physical qubits per logical. Similarly, Beauregard’s notional 4,099 logical qubits would translate to several million physical qubits if you actually built the error-correcting fabric for them. These numbers assume operations are parallelized optimally. If one instead tried to use the Chevignard approach with 1,730 logical qubits, one would still need millions of physical qubits (for those 1,730 logical) and would have to maintain coherent error-corrected operation over a much longer time. In some ways that’s an even tougher ask – keeping qubits stable and in sync for weeks or months of computation, rather than for hours, dramatically raises the required error correction performance. The bottom line: 4,000 logical qubits is not the same as 4,000 actual qubits. It’s more like 4,000 “qubit packets,” each containing maybe a thousand physical qubits, plus additional overhead for error management.
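A rough sense of where the “about a thousand physical qubits per logical qubit” figure comes from can be had with a commonly used rule-of-thumb model of the surface code (an approximation adopted here for illustration, not the detailed model Gidney and Ekerå used): the logical error rate per round falls off roughly as $$0.1\,(p/p_{th})^{(d+1)/2}$$, and a distance-d patch occupies about $$2d^2$$ physical qubits.

```python
# Rule-of-thumb surface-code overhead estimate (illustrative assumptions only):
# logical error per round ~ 0.1 * (p / p_th)^((d+1)/2), patch size ~ 2 * d^2 qubits.
P_PHYS = 1e-3      # assumed physical gate error rate
P_THRESH = 1e-2    # assumed surface-code threshold
TARGET = 1e-12     # per-round logical error budget, chosen (roughly) so that
                   # thousands of qubits can survive billions of operations

d = 3
while 0.1 * (P_PHYS / P_THRESH) ** ((d + 1) / 2) > TARGET:
    d += 2         # surface-code distances are odd
physical_per_logical = 2 * d ** 2

print(f"code distance d ~ {d}")                                      # 21
print(f"~{physical_per_logical} physical qubits per logical qubit")  # 882
print(f"~{20_000 * physical_per_logical / 1e6:.0f} million physical "
      f"qubits for 20,000 logical")                                  # ~18 million
```

With these particular assumptions the model lands near a thousand physical qubits per logical qubit and roughly twenty million in total, in line with the estimates above; changing the assumed error rate or error budget moves the answer by large factors, which is exactly the sensitivity discussed below.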
Today’s quantum computers are nowhere near this scale. Current hardware offers neither the number of qubits needed – whether 4,099 or 1,730 – nor the qubit quality those estimates assume. The largest quantum processors have just over a thousand physical qubits, and none of them can yet run Shor’s algorithm for anything but tiny demonstration problems. In fact, the largest number ever factored with Shor’s algorithm on real quantum hardware is $$21 = 3×7$$. (And even that achievement used various assists from classical post-processing to overcome noise.) The reason is straightforward: present devices have high error rates (often 0.1%–1% per gate operation) and very short coherence times (quantum states lasting microseconds to milliseconds at best). Shor’s algorithm for a large number requires a huge sequence of operations, far beyond what can be executed within those fragile timeframes. To break RSA-2048, a quantum computer would need to be fault-tolerant – continually correcting errors as it computes. This demands implementing those logical qubits with error correction, which is exactly what inflates the qubit counts into the millions. As of 2025, experimentalists have only just begun to demonstrate small logical qubits (a single encoded qubit, sometimes with an error rate better than the raw device but still error-prone). We effectively have zero logical qubits capable of long calculations available today. Going from zero to 4,000 is a staggering leap that will likely take many years of research and engineering.
It’s also worth noting that all these resource estimates – 4,000 qubits, 6,000 qubits, 1,700 qubits, 8 hours, 40 runs, etc. – come with assumptions that could turn out to be optimistic or that may need further refinement. They are typically ballpark figures to guide the cryptographic community, not exact blueprints of the future. Change the error rate by an order of magnitude, or impose an I/O bottleneck in the quantum architecture, and the required qubit counts might shift up or down by large factors. Gidney & Ekerå cautioned that doubling or halving the physical gate error rate in their model would change the number of qubits required by more than 10×. So, 4,099 shouldn’t be seen as a magically precise threshold; it’s a reference point under certain idealized conditions. The real threshold for breaking RSA-2048 will depend on the quality of qubits, the efficiency of error correction, and the exact optimizations used in the quantum implementation. It might be that fewer than 4,099 logical qubits could suffice if our physical qubits are extremely good – or conversely, we might need far more logical qubits if our error rates remain high.
Implications: How Close Are We to the Quantum RSA Threat?
For a cybersecurity audience, the natural question is: When will we actually have to worry about a quantum computer factoring RSA-2048? The good news is that, given the state of technology today, RSA-2048 remains unbreakable in practice and should remain so for some years to come. Even the most aggressive quantum resource estimate (such as the 6,150-logical-qubit circuit) requires multiple orders of magnitude more qubits than exist in any form today, along with error correction and runtime stability that have yet to be demonstrated. In other words, RSA-2048 is safe – for now. But the caveat is that the field of quantum computing is advancing on many fronts. The past decade brought the estimated physical-qubit requirement down from billions to tens of millions, while the logical-qubit requirement settled in the thousands. The next decade will bring bigger and better quantum processors. It’s impossible to predict the exact timeline: it could be 5 years or it could be 15 years before a quantum computer of this caliber exists. Some experts lean toward the longer end, considering the daunting engineering challenge. Others, including me, warn that unforeseen breakthroughs – new error-correction methods, or even a different quantum factoring algorithm – could accelerate the schedule. Technological progress isn’t linear, and breakthroughs can upend established predictions.
What is clear is that the 4,099 logical qubits figure should not be misinterpreted as “only 4,099 qubits and we’re done.” It’s not a trivial ask at all; it represents an immense quantum computing capability. Those qubits would have to be sufficiently error-protected to run a computation involving billions of steps, which effectively means an enterprise on the scale of a major national or corporate project to build a fault-tolerant quantum computer. When that day comes – when thousands of logical qubits are online – then indeed RSA-2048 will not last long. The machine could factor the RSA key used in a typical secure web connection perhaps in a matter of hours. All the theoretical work points to that conclusion: once enough stable qubits are available, Shor’s algorithm has no known cryptographic countermeasure. This is why efforts like NIST’s post-quantum cryptography project are underway, racing to standardize new encryption algorithms that could resist quantum attacks. The wise approach for cybersecurity professionals is to anticipate the quantum threat well before it materializes, because the lead time to transition global cryptographic infrastructure is on the order of a decade.
The main insight from the research so far is that no one has found a miraculous shortcut around the need for thousands of high-quality qubits. Breaking RSA-2048 will require a large, error-corrected quantum computer. Whether that’s 1,000, 4,000, or 20,000 logical qubits, it’s still a huge leap from where we are. Every proposal that reduces the qubit count has exacted a price in other resources like time or success probability. Every proposal that reduces the runtime has required throwing in more qubits. Shor’s algorithm has proven stubbornly resource-hungry: even as the estimates get refined, we’re always talking about machines on a scale well beyond today’s quantum labs. So while 4,099 qubits may be the famous number in headlines, the real message for cybersecurity is the gap between theory and deployable hardware. Theoretically, just a few thousand perfect qubits could crack RSA-2048 overnight – practically, achieving those “perfect qubits” is a monumental task that no one has completed yet.
In summary, the 4,099 logical qubits figure comes from serious science (Beauregard’s circuit and its kin) and remains a meaningful milestone in quantum cryptanalysis. It signifies the approximate minimum memory a quantum computer needs to run Shor’s algorithm for a 2048-bit key. Subsequent research has confirmed this scale while demonstrating you can trade between qubit quantity and computation time – but not evade the overall difficulty. The implication is that we will likely see RSA-2048 broken only when quantum computers reach a certain maturity: on the order of millions of physical qubits working in concert, backed by outstanding error correction to maintain thousands of logical qubits through trillions of operations. No such machine is available today, and building one will be one of the hardest technological feats in history. But the trajectory of research suggests it is not a matter of “if,” but “when.” That’s why the smart money is on migrating to quantum-safe encryption well before we hit that 4,099-qubit threshold in reality. By the time someone actually builds a quantum computer of that caliber, we want RSA-2048 to have already been retired to the history books.