Q-Day Revisited – RSA-2048 Broken by 2030: Detailed Analysis
Introduction
It’s time to mark a controversial date on the calendar: 2030 is the year RSA-2048 will be broken by a quantum computer. That’s my bold prediction, and I don’t make it lightly. In cybersecurity circles, the countdown to “Q-Day” or Y2Q (the day a cryptographically relevant quantum computer cracks our public-key encryption) has been a topic of intense debate. Lately, the noise has become deafening: some doom-and-gloom reports insist the quantum cryptopocalypse is just a year or two away, or is already here in secret government labs, while hardened skeptics claim it’s so distant as to never happen. The truth lies between these extremes. A sober analysis of the latest breakthroughs shows that Q-Day is not here yet and won’t happen tomorrow – but it’s also no longer on the hazy horizon of “maybe never.” In fact, recent advances have dramatically sharpened the timeline, bringing the fall of RSA into the plausible timeframe of around 2030.
As someone who’s been tracking quantum computing progress and making public Q-Day predictions for over 15 years, I’ve consistently argued that it’s not enough to watch the raw count of qubits in labs. We must also scrutinize improvements in quantum error correction, algorithm design, and the number of logical (error-corrected) qubits needed for an attack. Every few years I’ve updated my forecast accordingly. (Last one here: “Q-Day Predictions: Anticipating the Arrival of CRQC”). For years I held to an estimate of 2032 for Q-Day – but a string of major developments in just the last few days has compelled me to move that prediction forward.
Three pieces of recent news triggered this reassessment, each hitting a different “axis” of quantum progress: algorithmic efficiency, hardware error rates, and engineering roadmaps.
Here’s the quick-and-dirty takeaway for the impatient and those who follow quantum computing news:
- IBM’s latest roadmap (unveiled June 2025) targets a fault-tolerant system with ≈200 logical qubits by 2029 and clearly spells out a path to ≈1,000+ logical qubits by the early 2030s.
- Gidney’s May 2025 paper shows that ~1,000–1,400 logical qubits, running for about a week, are enough to factor RSA-2048 when paired with modern error-correction tricks.
- Oxford’s new fidelity record (single-qubit gate error ≈ 1 × 10⁻⁷) hints that each logical qubit could soon be built from hundreds, not thousands, of physical qubits.
Put those three facts together and my old 2032 Q-Day estimate is essentially baked in even if the field made zero further scientific breakthroughs after June 2025 and simply executed on what has just been published.
Realistically, though, we’ll keep squeezing qubit overhead with better error-correction codes, smarter factoring circuits, and faster hardware integration. That steady drumbeat of incremental wins makes 2030 the likelier arrival date, and slipping past 2032 now feels like the long shot, not the baseline.
I unpack all the supporting data, caveats, and counter-arguments in the full article.
As I summarized, in just the past few weeks, researchers have slashed (physical) qubit requirements for factoring RSA-2048 from tens of millions to under one million, demonstrated quantum gate fidelities at or beyond the threshold needed for effective error correction, and laid hardware roadmaps for large-scale fault-tolerant quantum computers by the end of this decade. In short, the pieces needed to factor a 2048-bit RSA key are rapidly falling into place. While this doesn’t mean an overnight collapse of cryptography, it does mean governments and industry must urgently recalibrate their post-quantum migration plans. The three recent breakthroughs I analyzed in separate posts:
- First, a new factoring algorithm published by Google researchers slashed the qubit count needed to factor RSA-2048 by an order of magnitude, which I analyzed here: “Quantum Breakthrough Slashes Qubit Needs for RSA-2048 Factoring.”
- Second, physicists at Oxford achieved a record-breaking low error rate in quantum operations (only 1 error in 6.7 million), foreshadowing much lower overhead for error correction. I summarize their paper here: “Oxford Achieves 10⁻⁷-Level Qubit Gate Error, Shattering Quantum Fidelity Records.”
- And third, IBM unveiled a detailed roadmap promising a fault-tolerant quantum computer by 2029 – years ahead of many expectations. I analyzed this announcement here: “IBM’s Roadmap to Large-Scale Fault-Tolerant Quantum Computing (FTQC) by 2029 – News & Analysis.”
These advances, in algorithmics, error correction, and scalable hardware, all point to the same conclusion: the timeline to a cryptanalytically (or cryptographically) relevant quantum computer (CRQC) is accelerating.
The punchline? If current trends hold, a quantum computer capable of breaking RSA-2048 will likely exist by around 2030. That doesn’t mean internet encryption collapses overnight or that we should all panic. But it does mean the prudent window for migrating to quantum-safe cryptography is right now. The latest science has shifted Q-Day from an “if” to a concrete question of “when,” and the smart bet is “sooner than previously thought.” Let’s explore why.
Three Key Recent Advancements and Their Implications
Shor’s Algorithm and the Road to Fewer Qubits
The quantum threat to RSA encryption originates with Shor’s algorithm, discovered in 1994, which theoretically allows a large quantum computer to factor integers (and thus break RSA) exponentially faster than any classical method. In simple terms, Shor’s algorithm uses a quantum routine to find the secret period of a modular arithmetic function, from which the prime factors of an RSA modulus can be deduced. The catch: implementing Shor’s algorithm for a 2048-bit number has always looked prohibitively demanding in terms of quantum resources. Early estimates were downright astronomical. Around 2012, researchers estimated that factoring a 2048-bit RSA key might require on the order of 10⁹ physical qubits under then-known techniques. That figure – a billion qubits – put Q-Day safely beyond any near-term horizon. Even a few years ago, in 2019, more refined analysis (by Craig Gidney and Martin Ekerå) brought the requirement down but still pegged it at roughly 20 million physical qubits to factor RSA-2048 in about 8 hours. These numbers seemed fantastical when real devices had only tens of qubits. It’s no wonder many experts felt RSA-2048 would remain secure well into the 2030s or 2040s.
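To make the period-to-factors step concrete, here is a minimal classical sketch with toy moduli. The brute-force order-finding loop is exactly the step a quantum computer would replace with Shor’s period-finding routine; everything else in the attack is ordinary classical arithmetic:

```python
from math import gcd

def order(a, N):
    """Multiplicative order of a mod N, found by brute force.
    This is the step Shor's algorithm performs with a quantum
    period-finding routine; the rest of the attack is classical."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor_from_period(N, a):
    """Recover factors of N from the period r of a^x mod N."""
    assert gcd(a, N) == 1          # else gcd(a, N) is already a factor
    r = order(a, N)
    if r % 2 == 1:
        return None                # odd period: retry with another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None                # trivial square root: retry
    return gcd(y - 1, N), gcd(y + 1, N)

print(factor_from_period(15, 7))    # (3, 5)
print(factor_from_period(3233, 3))  # (61, 53) -- a 12-bit toy modulus
```

For n = 2048 the brute-force loop above is hopeless for any classical machine; the entire quantum challenge is doing that one order-finding step on a 2048-bit modulus.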
But here’s the thing about cryptographic attacks: given enough brilliant minds and time, they almost always get better. Over the past decade, and especially over the last year, a succession of algorithmic breakthroughs has steadily chipped away at Shor’s hefty resource demands. A key focus has been optimizing the quantum circuits for modular exponentiation – the heart of Shor’s algorithm that multiplies numbers repeatedly modulo N (the RSA modulus). The first big breakthrough came from mathematician Stéphane Beauregard in 2003, who showed that you could factor an n-bit number using roughly 2n + 3 logical qubits. In principle, that’s only ≈4,100 logical qubits for n = 2048. Beauregard’s approach cleverly re-used quantum registers to cut down qubit count (in contrast to doing everything in parallel which would need ~3n qubits). The trade-off was time: his circuit had enormous depth (scaling on the order of n³ operations) which made it vulnerable to errors. Still, it demonstrated that in theory Shor’s algorithm wasn’t outrageously qubit-hungry – it could run with thousands of qubits, not trillions – if you didn’t mind running it very slowly. Back then, however, even thousands of error-corrected qubits implied millions of physical qubits, once error correction overhead was accounted for. So Beauregard’s result was academically interesting but didn’t change the bottom line that a CRQC was far out of reach.
Fast-forward to 2019, and we saw a major step toward practicality. Gidney and Ekerå revisited Shor’s algorithm with a bag of modern optimization tricks. They combined improved arithmetic circuits (like better adders and multipliers), qubit recycling strategies, and finely tuned space-time tradeoffs. The result: RSA-2048 could be factored with about 6,000 logical qubits in an 8-hour quantum computation. Under reasonable error-correction assumptions, that translated to roughly 20 million physical qubits. While 20 million is still huge, this was a 50× improvement over some prior estimates. Gidney & Ekerå’s paper became a landmark, often cited as the “state of the art” target for a cryptographically relevant quantum computer. It suggested that if you could build a machine with tens of millions of qubits and keep it running for a day, you could crack RSA-2048. In other words, the quantum “wall” defending RSA was high, but perhaps not unclimbable in a multi-decade timeframe.
The relentless drive to reduce the qubit count didn’t stop there. By 2021-2022, researchers were exploring even more radical ways to trade off quantum space (qubits) for time (number of operations). A breakthrough came in late 2024 when Chevignard, Fouque, and Schrottenloher introduced an algorithm using an approximate number system for modular exponentiation. Without diving into heavy detail: they found a way to perform the modular multiplications “piecewise,” using a tiny quantum register that handles a few bits of the number at a time, rather than holding the entire 2048-bit number in quantum memory. By recycling a small set of qubits over and over for each chunk of the calculation (and cleverly tolerating some approximation error that can later be corrected), they slashed the logical qubit requirement dramatically – down to roughly 1,730 logical qubits for RSA-2048. That’s less than a third of the qubits of the Gidney-Ekerå approach. The catch? Their method required many more sequential steps. In fact, it would take on the order of 2³⁶ quantum operations (roughly 70 billion) repeated about 40 times. This could mean running the quantum computer non-stop for months for a single factorization – an eternity in quantum-coherence terms. So while the ~1,700 logical-qubit ballpark was astonishingly low, the algorithm was impractically slow given foreseeable error-corrected clock speeds. It was a proof of concept that you could factor with very few qubits, if you didn’t mind waiting a long time and using massive error correction to survive the journey.
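A quick sanity check on that runtime claim, assuming (optimistically) one logical operation per microsecond:

```python
# 2**36 sequential operations, repeated ~40 times, at an assumed
# 1 MHz logical-operation rate (one op per microsecond).
ops = 2**36 * 40
print(f"{ops / 1e6 / 86400:.0f} days")  # ~32 days
```

And that is before error-correction overhead: each logical operation actually spans multiple error-correction cycles, so “months” is, if anything, generous.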
Now, here’s where 2025 changed the game. Craig Gidney’s new paper (May 2025) essentially took the best of both worlds: the low qubit footprint of the approximate method and the manageable runtime of the 2019 approach. His recipe, to paraphrase the title, shows “How to factor 2048-bit RSA with less than a million noisy qubits.” In concrete terms, Gidney demonstrated that a fully error-corrected quantum computer could factor a 2048-bit RSA key in under one week with 1,399 logical qubits including ancilla, magic-state, and “idle” qubits (from the resource table in the Gidney 2025 pre-print) encoded into <1 million physical qubits. This is roughly a 20× reduction in qubit count from the 2019 estimate, at the cost of a roughly 20× increase in runtime (8 hours then vs. a few days now). At the heart of Gidney’s 2025 method are clever space–time optimizations and error-correction-aware techniques. First, it employs approximate arithmetic to dramatically reduce the size of quantum registers needed for Shor’s algorithm. By breaking the huge 2048-bit modular exponentiation into smaller pieces and tolerating tiny calculation errors (that can be corrected later), the algorithm avoids the old “one qubit per bit of the number” rule. This qubit recycling strategy, combined with “yoked” high-density storage of idle qubits and new ways to generate low-error magic states, cuts the logical qubit requirements to the bone. Gidney also managed to reduce the total gate count (operations) by over 100× compared to other low-qubit methods, ensuring the runtime doesn’t blow up even as qubit count drops. (For more on how Gidney pulled off a million-qubit RSA breaker, see my summary “Quantum Breakthrough Slashes Qubit Needs for RSA-2048 Factoring.”)
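As a loose classical analogy for this “piecewise” style (my illustration, not Gidney’s actual construction), here is modular exponentiation processed a few exponent bits at a time, reusing one small accumulator instead of ever materializing large intermediate state:

```python
def windowed_modexp(a, e, N, window=4):
    """Left-to-right windowed exponentiation: consume a few
    exponent bits per step, keeping only one small accumulator."""
    acc = 1
    for i in range(e.bit_length() - 1, -1, -window):
        w = min(window, i + 1)                      # final window may be short
        bits = (e >> (i - w + 1)) & ((1 << w) - 1)  # next w exponent bits
        acc = pow(acc, 1 << w, N)                   # shift accumulator left w bits
        acc = acc * pow(a, bits, N) % N             # fold this window in
    return acc

assert windowed_modexp(7, 65537, 999983) == pow(7, 65537, 999983)
```

The quantum versions of these windowed and approximate tricks are far subtler (they must remain reversible and error-correctable), but the underlying space-for-time bargain is the same.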
Crucially, Gidney’s design stays within the realm of plausibility: it assumes physical qubits with error rates around 0.1% and operation speeds ~1 MHz, which is roughly what today’s best qubits achieve in labs. In other words, he’s not postulating some magical new qubit technology – the gains come from better algorithms and error correction techniques, not fantasy hardware. The bottom line: factoring RSA-2048 now appears technically feasible with roughly a million physical qubits executing for a few days. Just a few years ago, experts viewed this as utterly impractical.
In sum, quantum algorithm researchers have orchestrated a remarkable resource collapse: from one billion to 20 million to now one million qubits needed to break RSA-2048. Each leap was powered by ingenuity in circuit design and error correction rather than just waiting for hardware to improve. The trajectory is unmistakable – the “quantum qubit barrier” for breaking RSA has been dropping by orders of magnitude roughly every 5–6 years. And importantly, none of these developments violate known physics or require sci-fi tech; they’re working within the constraints of what near-future quantum hardware is expected to achieve. Gidney’s latest work even explicitly aligns with NIST’s conservative planning timeline for quantum risk, noting that migrating to post-quantum cryptography by the early 2030s is vital since attacks only get better. This rapid algorithmic progress is one big reason I’ve pulled in my Q-Day estimate. When the “number of qubits required” goes from impossible to merely daunting, one can’t be complacent.
Being Careful About the Hype
Before moving on, let’s address one outlier: the much-hyped claim in late 2022 that RSA-2048 might be breakable with just 372 qubits. This came from a Chinese research group who factored a 48-bit number with a hybrid quantum-classical approach and extrapolated that 372 high-quality qubits might crack 2048-bit RSA. The news sparked both excitement and exaggerated headlines (“RSA broken!”) in early 2023. However, in the following months the wider scientific community found that claim highly doubtful. The method relied on heuristic lattice algorithms and variational quantum routines (QAOA) that don’t scale cleanly, and independent experts could not replicate the results for anything larger than tiny toy problems. In short, the 372-qubit claim turned out to be more hype than reality. (I wrote more about the original paper and the recent update here “Breaking RSA Encryption: Quantum Hype Meets Reality (2022–2025).”) It’s a cautionary tale: extraordinary assertions (especially those bypassing Shor’s algorithm entirely) require extraordinary proof, which so far hasn’t materialized. As of 2025, Shor’s algorithm and its optimized descendants remain the only established path to factoring RSA-2048 on a quantum computer.
Breaking the Error Barrier: Record Fidelity and Fault-Tolerance
So far we’ve focused on qubit counts and algorithms, but an equally crucial piece of the Q-Day puzzle is quantum error correction (QEC). No matter how clever your algorithm is, if your qubits are too noisy, you’ll never complete the computation. That’s why every theoretical qubit estimate (like “1 million qubits needed”) is implicitly talking about physical qubits that are error-corrected to act as a much smaller number of logical qubits. The overhead cost of error correction has traditionally been the biggest barrier to realizing a CRQC. It’s like a tax on quantum computation: if each logical qubit needs 1,000 physical qubits to stay error-free, suddenly your 1,399-logical-qubit algorithm demands ~1.4 million physical qubits. But what if you could lower that tax? Recent advances suggest we can – by improving the quality of individual qubits.
In June 2025, a team at Oxford University announced a milestone in qubit fidelity that truly turned heads. They demonstrated single-qubit gate operations with an error rate at the 10⁻⁷ level – about one error in 6.7 million operations. In terms of fidelity, that’s roughly 99.99999% accuracy, the highest ever recorded for any quantum hardware. For perspective, until now the best qubits (in ion traps or superconducting circuits) topped out around 99.9%–99.99% (10⁻³ to 10⁻⁴ error) for single-qubit gates. Hitting the 10⁻⁷ error scale is a big deal – it’s like leaping from a car that occasionally hiccups to one that drives hundreds of miles without a single sputter. The Oxford group achieved this using a trapped-ion qubit (a calcium-43 ion) manipulated by ultra-stable microwave pulses instead of the usual lasers, which dramatically reduced noise and calibration errors. The ion’s coherence time stretched to an incredible 70 seconds, letting them run millions of gate operations while hardly accumulating any decoherence. In short, they built a qubit that is for practical purposes almost perfectly reliable for single-qubit flips.
Why does a 10⁻⁷ error rate matter? Because in quantum error correction, lower physical error directly translates to lower overhead for achieving a given logical accuracy. Most QEC codes (like the popular surface code) have a threshold error rate around 10⁻² to 10⁻³ – meaning if your qubits are noisier than ~0.1–1%, error correction doesn’t even work. When operating just below threshold (say 0.1% error), you might need thousands of physical qubits per logical qubit to push the logical error rate down to, for example, 10⁻¹⁵ (good enough for a long computation). But if your physical error is 10⁻⁷, you could achieve the same logical fidelity with far fewer physical qubits – potentially on the order of only tens of qubits per logical qubit. In other words, high-fidelity qubits can shrink the size (and timeline) of a quantum computer by orders of magnitude.
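A back-of-envelope sketch of that overhead argument, using the standard surface-code scaling p_L ≈ A·(p/p_th)^((d+1)/2) with assumed constants (A ≈ 0.1, threshold p_th ≈ 1%, and roughly 2d² physical qubits per distance-d logical qubit):

```python
import math

def surface_code_distance(p_phys, p_target, p_th=1e-2, A=0.1):
    """Smallest odd code distance d such that
    A * (p_phys / p_th) ** ((d + 1) / 2) <= p_target.
    A and p_th are rough textbook assumptions, not measured values."""
    # Solve in log space; the small slack absorbs float rounding.
    need = math.log(p_target / A) / math.log(p_phys / p_th)
    d = max(3, math.ceil(2 * need - 1 - 1e-9))
    return d if d % 2 == 1 else d + 1

for p in (1e-3, 1e-7):
    d = surface_code_distance(p, 1e-15)
    print(f"p_phys = {p:.0e}: distance {d}, "
          f"~{2 * d * d} physical qubits per logical qubit")
# p_phys = 1e-03: distance 27, ~1458 physical qubits per logical qubit
# p_phys = 1e-07: distance 5, ~50 physical qubits per logical qubit
```

The constants are crude, but the qualitative point survives any reasonable choice: four orders of magnitude less physical error buys roughly a 30× smaller error-correction footprint per logical qubit.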
To make this concrete, consider our RSA-2048 factoring scenario. Gidney’s 2019 estimate (20 million physical qubits) assumed roughly 0.1% gate errors. If each physical qubit failed only once in 10 million operations instead of 1 in 1,000, how many would we need? While a precise recalculation is complex, researchers have speculated that instead of tens of millions of physical qubits, a few hundred thousand could suffice if we had 10⁻⁷-level fidelity across the board. In fact, the Oxford team mused that if two-qubit gates (currently the weak link) could be pushed down to 10⁻⁷ errors as well, breaking RSA-2048 might only require “hundreds of thousands or even less” physical qubits. That’s a far cry from a billion. It suggests that an all-out effort to improve qubit fidelity can significantly accelerate the timeline to a CRQC by lowering the hardware requirements.
Of course, as the Oxford researchers themselves caution, they only demonstrated this fidelity on a single qubit. The hardest part – two-qubit entangling gates – still lags behind. In the best ion-trap systems and superconducting chips today, two-qubit gate errors are around 0.1% to 1% (10⁻³ to 10⁻²). That’s much worse than 10⁻⁷. The Oxford result effectively says: we’ve eliminated single-qubit errors as a concern; now all focus is on multi-qubit gates, memory errors, and readout errors. The good news is there’s progress there too – for instance, Quantinuum (Honeywell) recently demonstrated two-qubit gates at 99.9% fidelity with their ions, and a superconducting experiment at MIT achieved about 99.92% fidelity for a two-qubit gate in 2024. Crossing the “three nines” (99.9%) threshold for 2-qubit gates is huge, because many QEC codes start working effectively around that level. If we can push to four or five nines in entangling gates over the next few years, the error-correction overhead will plummet correspondingly.
The broader point is this: quantum hardware is rapidly improving in quality, not just quantity. There’s a growing “fidelity-first” movement in quantum engineering that prioritizes perfecting a qubit’s performance before scaling up in number. The rationale is simple – a few high-quality qubits can teach us how to build thousands later without an unmanageable error burden. As one observer put it, trying to scale up “faulty” qubits is a dead end; it’s better to make qubits much more reliable now so that when we scale to, say, 1,000 qubits, they behave like 1,000 “nearly perfect” qubits and require far less error-corrective overhead. The Oxford achievement is a proof-of-concept that with relentless engineering, physical qubits can approach the kind of error rates that make fault-tolerance almost easy. If that philosophy takes hold across the industry, we could see the effective timeline to a useful quantum computer shorten dramatically.
From a Q-Day perspective, the takeaway is optimistic (for the quantum builders) and ominous (for our crypto safety nets): the hardware error problem is gradually being tamed. When you hear quantum contrarians say “quantum computers will never work because of decoherence and noise,” point them to the Oxford result. Yes, it’s one qubit, but it’s a qubit that ran 10 million operations with essentially no mistakes. That’s strong evidence that noise can be beaten, at least at the single-qubit level. And as noise falls, the “effective qubits needed” to break RSA falls too. A machine with fewer but cleaner qubits might do the job just as well as one with lots of noisy qubits. This means we have to keep an eye not only on qubit counts but also on fidelity records when estimating Q-Day.
The Hardware Race: From 100 Qubits Today to 1,000,000 by 2030?
The final piece of the puzzle is scaling up the hardware. We’ve seen algorithms and error rates trending favorably, but do we have any realistic path to building a quantum computer with, say, one million physical qubits (or equivalently a few thousand logical qubits)? Until recently, that question elicited eye-rolls – today’s devices only have on the order of hundreds of qubits, and none are error-corrected. However, 2023–2025 has seen quantum industry roadmaps become markedly bolder and more concrete about reaching the million-qubit scale. The key players – IBM, Google, and others – are no longer shy about targeting that milestone within the next decade.
IBM’s 2029 Roadmap
In June 2025, IBM made waves by announcing plans to deliver a fault-tolerant quantum computer by 2029. Codenamed Quantum “Starling”, this system is slated to have about 200 logical qubits – which, IBM notes, would likely be enough to show clear advantages over classical supercomputers on certain tasks. To get there, IBM is leveraging a modular architecture. They had already achieved a 1,121-physical-qubit chip in 2023 (the “Condor” processor), hitting the limits of a single-chip design. Instead of, say, a 10,000-qubit monolithic chip (likely impossible due to I/O and cooling constraints), IBM is developing modular quantum chips that can be linked. Their 156-qubit “Heron 2” chips, introduced in 2024, are designed to connect to each other via high-fidelity interconnects. Essentially, IBM’s plan is to build a quantum computer the way we build supercomputers: multiple modules networked together, with both quantum and classical links coordinating them. By 2027, IBM aims to have multi-module systems with error correction running, and by 2029, the full Starling system operational.
In parallel, IBM has already shown small-scale fault tolerance experiments: in 2023, IBM researchers entangled two logical qubits (encoded on 133 physical qubits total) and maintained them with error correction, achieving a logical entangled state fidelity of ~94%. That was a hint that even with current chips, a few logical qubits can be realized.
IBM’s confidence is now high – as their head of Quantum, Jay Gambetta, put it: “We’ve answered the science questions… now it’s an engineering challenge”. No new physics miracles are needed, just scaling up and refining.
Google’s Quantum AI
Google has been somewhat quieter on public roadmaps, but their goals are similarly ambitious. In 2020, Google’s CEO hinted at aiming for a useful error-corrected quantum computer by the end of the decade (the “2029” timeframe) – and internally, Google’s researchers have eyed the million physical qubit threshold as well.
Google’s focus has been demonstrating the fundamentals of error correction: in 2023, they reported the first instance where a larger quantum error-correcting code outperformed a smaller code, meaning adding qubits actually reduced the error rate of a logical qubit. This was a crucial proof-of-concept that QEC works as advertised on real hardware. They used a 49-qubit grid (distance-5 surface code) and saw lower logical error than with a 17-qubit (distance-3) code, even beating the error of the best single physical qubit. With that milestone reached, Google is now trying to string together logical operations. In 2025, they have a prototype logical qubit that lasts long enough to perform multi-step algorithms, and they are aggressively experimenting with new chip designs (including dual-layer chips for routing, better materials to cut error rates, and perhaps incorporating components like photonic links or superconducting resonators to network modules).
Google has also been a leader in algorithmic research (as evidenced by Gidney’s work), which informs their hardware targets. While not formally announced, it’s believed Google is targeting on the order of 100+ logical qubits by ~2028 and then scaling to the thousands by the early 2030s. Given their partnership with NIST on post-quantum cryptography and their public emphasis that we must adhere to NIST’s 2030/2035 migration timeline, Google clearly sees the first half of the 2030s as the danger zone when quantum codebreaking becomes feasible.
Trapped Ion Platforms (Quantinuum, IonQ)
The ion trap approach trades off speed for fidelity. Companies like Quantinuum (Honeywell) and IonQ have far fewer qubits on their devices (dozens), but with impressively low error rates and full connectivity. Quantinuum demonstrated a fully error-corrected logical qubit (using the 7-qubit Steane code) as early as 2021, including real-time error correction cycles. They have since kept a logical qubit “alive” with multiple rounds of QEC and are nearing the break-even point where the logical qubit’s error rate is below that of any physical qubit – a crucial tipping point. Their roadmap suggests a small fault-tolerant computer (maybe tens of logical qubits) by late this decade, and scaling via ion trap networking after that. This involves linking many ion traps with photonic interconnects (quantum links over fiber), effectively building a distributed quantum computer.
IonQ, meanwhile, has been pursuing a different scaling method: moving ions between multiple traps and using advanced ion transport to handle more qubits. IonQ’s public goal is to hit #AQ > 1000 by 2028 (their “algorithmic qubit” metric roughly corresponds to being able to run a 1000-qubit algorithm with decent fidelity). If ion- or atom-based systems can reach a few hundred fully error-corrected qubits by ~2030, they too could threaten RSA soon after – because, recall, with extremely high fidelity you might not need a million physical qubits; a few hundred logical qubits could do.
These companies haven’t made splashy “2030 we break RSA” claims, but they consistently cite the mid-2030s as when they expect full-blown fault-tolerant machines to be operational.
Photonic Quantum Computers (PsiQuantum)
Perhaps the most bullish of all is PsiQuantum, a well-funded startup that insists a million-qubit photonic quantum computer is the only way to achieve useful quantum computing. They’ve been aiming straight for that goal from the start, with a timeline reportedly targeting around 2027–2028 for having a functional large-scale system in place. Photonic qubits (single photons) don’t suffer from decoherence in the same way as matter-based qubits, and they can be routed in fiber optics, making it conceptually easier to build a huge network of them. PsiQuantum’s plan involves a room-sized machine with silicon photonic chips interconnected by thousands of fiber cables, generating and manipulating photons to create entangled cluster states for computation. They are working with global partners to fabricate the necessary photonic chips using conventional semiconductor fabs. The upside is potential scalability (the hardware is closer to telecom/datacom equipment than delicate ion traps or superconductors). The downside is that photon gates are probabilistic and require a lot of overhead (like many optical switches and detectors to manage losses). Still, PsiQuantum claims no fundamental breakthroughs are needed, just engineering: “no physics left to be solved, only scale-up” is their refrain, similar to IBM’s tone.
If they indeed field a million-physical-qubit (photonic) system by 2028, even if each logical qubit requires thousands of photons, they could reach triple-digit logical qubits relatively soon. One of PsiQuantum’s founders even suggested that IBM’s 2029 superconducting FTQC will face enormous cryogenic and wiring challenges, whereas a photonic machine could be built in a more data-center-like environment. Time will tell – but it’s worth noting that multiple approaches are racing neck-and-neck to the same finish line.
Wild Cards
There are other qubit modalities and efforts that shouldn’t be overlooked. Microsoft’s pursuit of topological qubits (based on Majorana zero modes) is high-risk, high-reward: if they succeed, each qubit would be inherently more stable, potentially cutting the error correction cost dramatically. After years of struggle, Microsoft reported some progress in 2022-2023 with evidence of the required Majorana states, but a functional qubit isn’t there yet. If by chance they crack it in the next couple of years, that could accelerate Q-Day by making quantum computers much more compact.
Likewise, companies like Intel (silicon spin qubits) and university labs working on neutral atom arrays are pushing on scalability. Intel hopes its chip-fab expertise can produce dense arrays of tiny spin qubits and control them with on-chip electronics, avoiding the wiring bottleneck of superconductors. They’ve got 49-qubit test chips and talk of 1000+ qubit arrays later this decade.
Neutral atoms (like those from Pasqal or ColdQuanta) can naturally trap hundreds of atoms – e.g., Pasqal showed >1000 atom sites in 3D – though controlling them with low error is the challenge. Any of these could have a breakthrough that suddenly boosts qubit counts or reliability beyond the current curve.
All told, the hardware timeline has dramatically firmed up. What used to be tentative “maybe 15-20 years” statements have turned into concrete promises like “by 2029, we will have X.” Of course, roadmaps can slip – engineering is hard. But the alignment of so many major players around the late-2020s/early-2030s for achieving large-scale quantum computers is significant. They are effectively betting billions of dollars that this can be done within ~5 years. This lends credence to the idea that Q-Day, which requires such a machine, is likely in that same timeframe. Indeed, as media coverage noted when Gidney’s 2025 factoring result came out, multiple corporate and government roadmaps now point to around 2030 for million-qubit processors. If those machines materialize on schedule, running Shor’s algorithm against 2048-bit RSA “in about a week” becomes a realistic capability. Even if the hardware arrives a little late, say by 2032 or 2033, that’s still within the planning horizon of today’s cybersecurity roadmaps.
One more consideration: national security programs. Everything above is based on publicly available info from companies and academic labs. But given what’s at stake, it’s plausible (indeed likely) that some nation-state projects are quietly pushing toward a CRQC on a similar timeline. Government-funded quantum efforts in the U.S., EU, and China are massive – China especially has poured substantial resources, though the details are opaque. If one of these programs made a breakthrough (either in hardware or algorithms), they might keep it classified, at least for a while, to exploit the advantage. This “unknown unknown” adds a layer of uncertainty. It’s conceivable that Q-Day could even arrive sooner than the consensus timeline if a secret project hits paydirt. That’s speculative, but it’s a risk scenario serious enough that, for example, the U.S. NSA has been warning about “harvest now, decrypt later” tactics for years.
From Weeks of Computation to Megawatts of Power: The Realities of an RSA-Breaking Quantum Computer
While I’ll keep repeating that we should start preparing for Q-Day now, I also don’t want to cause panic. Somewhat good news for cybersecurity professionals is that even once a cryptographically relevant quantum computer (CRQC) exists, breaking RSA-2048 won’t be trivial in practice – it will likely be an expensive, specialized endeavor. Gidney’s design, for instance, would consume on the order of a week of runtime on a million-qubit machine. Each 2048-bit number factored might require billions of quantum gate operations executed in sequence. In today’s terms, that’s a massive computation – by comparison, current noisy devices struggle to maintain state beyond a few hundred gates. So while a future CRQC could crack a single RSA key in, say, 3–7 days, it won’t be cracking thousands of keys on a whim without significant upgrades in throughput.
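One way to see the practical ceiling: even at full utilization, a single such machine is a low-throughput codebreaker. A trivial illustration, assuming ~5 days per key:

```python
# Throughput, not just latency: one machine, one key at a time,
# ~5 days per factoring run (assumed), zero downtime.
days_per_key = 5
print(365 // days_per_key, "keys per year at best")  # 73
```

Seventy-odd keys a year is a strategic weapon aimed at high-value targets, not a dragnet over all encrypted traffic.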
Furthermore, the energy cost of such a feat will be enormous. Large-scale quantum computers (especially superconducting ones) demand power-hungry cryogenics and control systems. In my previous post “The Enormous Energy Cost of Breaking RSA‑2048 with Quantum Computers” I tried to summarize why factoring a single RSA-2048 key could cost tens of thousands of dollars in electricity alone. Even with hardware improvements, we are likely talking tens of megawatt-hours of energy and many thousands of dollars in electricity per RSA key broken. (By contrast, classical supercomputers, while also power-hungry, would need billions of years to do the same task, so we’ve traded impossible time for heavy energy.) The takeaway is that early quantum attacks will be the domain of nation-states or elite organizations – those who can allocate dedicated facilities and power budgets to target the “crown jewels” of encrypted data. As I previously wrote, it’s “not just about qubits and math; it’s about megawatts” as well.
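To put rough numbers on that claim (all inputs here are my illustrative assumptions, not figures from the post linked above):

```python
# Illustrative electricity bill for one RSA-2048 factoring run.
power_mw = 0.5        # assumed steady draw: cryogenics + control (MW)
runtime_h = 5 * 24    # ~5 days of runtime
usd_per_mwh = 150     # assumed industrial electricity price

energy_mwh = power_mw * runtime_h
print(f"{energy_mwh:.0f} MWh, ~${energy_mwh * usd_per_mwh:,.0f}")
# 60 MWh, ~$9,000 -- per key, before hardware and staffing costs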
From Forecast to Reality: Why I’m Predicting 2030 for Q-Day
Bringing together all these threads, let’s answer the key question: When will RSA-2048 actually be broken by a quantum computer? My updated prediction, based on the evidence discussed, is 2030 – give or take a year. This is more aggressive than many past estimates, so let me clarify the reasoning and also what this implies for action.
First, consider the trendline. As the Google team noted, the resource estimates for quantum factoring have been dropping by roughly an order of magnitude every few years. In 2012: 10⁹ qubits. In 2019: 2×10⁷ qubits. In 2025: 10⁶ qubits. Meanwhile, the largest quantum hardware has grown from ~50 qubits in 2017 to ~1000 qubits in 2023, and possibly modules of thousands by 2026. If we extrapolate this interplay for another few cycles, we converge on an intersection: by around the end of this decade, we might have machines with on the order of 10⁵–10⁶ physical qubits (thanks to modular scaling), and the algorithmic requirements might have shrunk to match. IBM essentially confirmed this convergence by stating their 2029 machine (200 logical qubits) should be at the cusp of quantum advantage and presumably could be expanded to cryptanalysis tasks. My estimate of “Q-Day ~2030” assumes that no extraordinary roadblocks emerge in scaling quantum systems. It also assumes no further miraculous algorithmic leaps that would pull the date even closer (2030 is already quite soon given today’s state, but I’m comfortable with it given the rate of progress).
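You can replicate the convergence argument with a crude log-linear extrapolation of both curves (purely illustrative; the inputs are the rounded figures above):

```python
import math

# Falling qubit requirements vs. growing machine sizes, fitted
# log-linearly through the rounded data points from the text.
req = {2012: 1e9, 2019: 2e7, 2025: 1e6}  # physical qubits needed
hw = {2017: 50, 2023: 1000}              # largest devices built

def log_linear(points):
    (x1, y1), (x2, y2) = min(points.items()), max(points.items())
    slope = (math.log10(y2) - math.log10(y1)) / (x2 - x1)
    return lambda x: 10 ** (math.log10(y1) + slope * (x - x1))

need, have = log_linear(req), log_linear(hw)
for year in range(2026, 2036):
    if have(year) >= need(year):
        print("curves cross around", year)  # -> 2031
        break
```

With these rounded inputs the crossing lands around 2031; reasonable variations in the data points move it only a year or so in either direction, which is exactly the window under discussion.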
How does this align with others’ views? Surveys of experts have typically given a wide range: low probability in the 2020s, rising in the 2030s, and majority confidence by 2035. In other words, the consensus has been “sometime in the 2030s” with uncertainty on the exact year. My 2030 call is on the early edge of that consensus – arguably a bit more urgent. Why? Because the developments of the last two years (2023–2025), and especially the last few weeks (which no expert poll has yet captured), have all skewed towards sooner rather than later: breakthroughs in reducing qubit needs, improving error rates, and accelerating investment have tightened the timeline. If you asked me in 2020, I’d have said maybe 2035–2040. By 2022, seeing the pace, I moved to 2032. Now with the evidence at hand, I’m comfortable saying 2030 is a realistic target barring unforeseen slowdowns. And keep in mind, this isn’t a guarantee – it’s about risk. I’d characterize it like this: there is a non-trivial (say 30–50%) chance of a CRQC by 2030, rising to near certainty by 2035. That’s enough to act on, especially given the stakes.
It’s worth mentioning that Q-Day might not announce itself with fanfare. The first time RSA-2048 is factored by a quantum computer could very well happen in a classified lab or a clandestine project, with results kept secret. Alternatively, it might be a public demonstration by a company or research team to prove a point (much like the 1990s demonstrations of breaking 56-bit DES encryption – done to convince the world it was breakable). If it’s the latter, we might get a heads-up like “Today, researchers have factored RSA-2048 using a quantum computer, in a calculation that took X days on Y qubits.” If it’s the former (e.g., a nation-state doing it quietly), we may not know Q-Day has arrived until much later (when perhaps leaked or when encrypted data starts getting mysteriously decrypted). The prudent course, of course, is to assume Q-Day could effectively come as soon as the technology is capable of it, whether or not it’s widely announced.
To be clear, when I say “RSA-2048 will be broken by 2030,” I am referring to a demonstration of factoring a 2048-bit number with a quantum computer. This is the canonical definition of Q-Day because RSA-2048 is a standard benchmark for public-key cryptography strength. But a CRQC would equally threaten other cryptosystems of similar or lesser strength: for instance, 256-bit elliptic curve (like the curves securing Bitcoin and many HTTPS connections) would also fall to Shor’s algorithm with roughly the same order of effort. In fact, breaking a 256-bit EC key is a bit easier than 2048-bit RSA in theory (fewer qubits needed), so an RSA-breaking quantum computer certainly breaks ECC too. Diffie-Hellman key exchange, DSA, and any finite-field or elliptic curve system – all would be vulnerable. So Q-Day is not just about RSA; it’s about the collapse of essentially all traditional public-key crypto. Symmetric crypto (AES, etc.) is less affected – Grover’s quantum algorithm can speed up brute force attacks, but doubling key sizes mitigates that. The real catastrophe is in our asymmetric crypto infrastructure. Thus, when I say RSA-2048 will be broken, it’s shorthand for “our current public-key algorithms (RSA/ECC/DH) will no longer be safe.”
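On the symmetric side, the reason key-doubling suffices is that Grover’s speedup is only quadratic – a quick illustration:

```python
# Grover searches ~2**(k/2) of 2**k keys, so doubling the key
# size restores the pre-quantum security margin.
for k in (128, 256):
    print(f"AES-{k}: ~2^{k // 2} Grover iterations to brute-force")
# AES-128: ~2^64  (uncomfortable, long term)
# AES-256: ~2^128 (comfortably out of reach)
```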
Now, some might argue: even if a million-qubit machine exists by 2030, running it to factor a large number might still be an arduous task – maybe it’ll take weeks or months of runtime, maybe only nation-states will have the capability, etc. All true. But from a defensive standpoint, that’s irrelevant. If a quantum computer can break your encryption given enough time, the encryption is effectively broken. Also, the technology curves only improve from there – what takes a nation-state weeks in 2030 could take a university days in 2035 and a script kiddie minutes by 2040, to exaggerate only slightly. The key point for decision-makers is that the risk becomes real the moment someone, somewhere, can do it at all. And that moment is approaching rapidly.
Here it’s useful to invoke Mosca’s rule, coined by Dr. Michele Mosca: if X = the years you need your data secure, Y = the years to deploy new crypto, and Z = the years before quantum breaks your crypto, then if X + Y > Z, you’re in trouble. For many organizations, X (data confidentiality requirement) might be 5–10 years or more, and Y (upgrade time) is also several years. So if there’s even a decent chance Z (Q-Day) is less than ~10-15 years out, you have a serious risk to address now. With 2030 as a credible target (just ~5 years from now for first capability, ~10 years for broader availability), X+Y for most should be greater than Z – meaning we’re already in the danger zone. Indeed, NSA and NIST effectively acknowledged this: NIST’s guidance is to begin migrating immediately and have quantum-vulnerable crypto deprecated by 2030. The U.S. government’s national security systems are mandated to switch to post-quantum algorithms in the next few years (with NSA setting deadlines in the mid-2020s for starting the transition). These timelines weren’t picked arbitrarily; they were based on risk assessments that essentially assume a CRQC might exist by the early-to-mid 2030s. With recent advances, those assumptions look even more valid, if not conservative.
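In code form, Mosca’s rule is a one-liner; the sample values below are hypothetical but typical for a large organization:

```python
def mosca_at_risk(x_secrecy_yrs, y_migration_yrs, z_qday_yrs):
    """Mosca's inequality: you are in trouble if X + Y > Z."""
    return x_secrecy_yrs + y_migration_yrs > z_qday_yrs

# Data must stay confidential 7 years, migration takes 4,
# Q-Day assumed 5 years out:
print(mosca_at_risk(7, 4, 5))  # True -> already in the danger zone
```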
Let’s also address the outliers: the doomsayers and the nay-sayers. On one end, you have sensational claims that Q-Day is “next year” or “already here in secret.” As of now, there’s no credible evidence that anyone has a quantum computer powerful enough to threaten RSA-2048 in 2025 or 2026. We still struggle to keep just a few logical qubits alive. So, no, the sky isn’t falling in 2025 or 2026 – don’t let vendor marketing or hyperbolic media convince you otherwise. On the other end, you have respected cryptographers or physicists who remain deeply skeptical, saying things like “quantum computers will never be scalable” or “we won’t see this for many decades, if ever.” I think the progress detailed in this article is a strong rebuttal to the extreme skeptics. We’ve seen too many “impossible” milestones reached in the last few years to claim it’ll never happen. The conversation has shifted from “if” to “when” – even the cautious experts concede it’s a matter of time (with the second half of the 2030s as an outside guess). In my view, clinging to “never” is wishful thinking that could leave you badly exposed if wrong. The history of technology is full of examples where breakthroughs came sooner than anticipated once a field hit an exponential growth phase – and quantum computing appears to be at the cusp of such a phase right now.
So 2030 it is. Perhaps I’ll be off by a couple of years – nobody can pinpoint the exact year with certainty. But as someone responsible for protecting data, you have to plan for the worst plausible case consistent with evidence. And the evidence now says the worst plausible case is only a handful of years away, not decades.
Conclusion
If there’s one message to take away, it’s this: the quantum threat to cryptography is no longer a distant abstraction; it’s a tangible and approaching reality. Whether Q-Day arrives in 2028, 2030, or 2033, the difference is marginal – all are soon enough that we must prepare today. From the discussion above, a few key points stand out for security professionals and policymakers:
- The Quantum Attack Trajectory is Shortening: In 2012, breaking RSA-2048 needed a billion qubits – effectively impossible. By 2019, 20 million qubits. Now it’s around one million qubits and a few days of runtime. This trend of improved algorithms and error correction is likely to continue. We can’t bank on RSA’s safety by saying “quantum computers need too many qubits” – that number keeps dropping. Assume that what looks infeasible now will become feasible sooner than expected; recent breakthroughs are proof of that dynamic.
- Advances in Hardware are Accelerating: Real experiments have demonstrated core requirements of a CRQC: logical qubits with error correction, 99.99999% fidelity gates, and multi-chip quantum processors in the works. Industry leaders and government programs are pouring resources into scaling up. Multiple credible roadmaps target the early 2030s for large-scale quantum machines capable of cryptanalysis. This is not science fiction – it’s the explicit goal of IBM, Google, and others, with progress milestones being hit each year.
- PQC Transition is Urgent and Unavoidable: If anyone still doubted whether to invest in post-quantum cryptography migration, these developments should erase that doubt. We now have NIST-standardized PQC algorithms (ML-KEM/CRYSTALS-Kyber, ML-DSA/CRYSTALS-Dilithium, and others), and major tech firms have begun implementing them (Google, Cloudflare, AWS, etc., are testing PQC in protocols and services). NIST’s recommended timeline is to start phasing out vulnerable crypto by 2030, and completely eliminate it by 2035. That timeline wasn’t picked casually – it aligns with when a quantum threat becomes not just possible but probable. Given the lead time required to transition systems (which can be 5-10 years for large enterprises or government agencies), starting now is the only viable strategy. Every year of delay increases the risk of being caught by Q-Day before you’ve finished upgrading. Remember, cryptographic agility (the ability to swap out algorithms) is part of resilience. If you haven’t inventoried where you use RSA/ECC and developed a migration plan, you’re already behind.
- “Store Now, Decrypt Later” is a Real Threat: Data that is encrypted today can be recorded by adversaries and kept until they have a quantum computer to decrypt it. This especially affects sensitive data with long confidentiality needs – think national security intelligence, healthcare records, confidential business plans, personal data protected by privacy laws, etc. If such data has a shelf life of more than ~5-10 years, assume that anything encrypted with RSA/ECC today might be readable by the 2030s. The only defense is to either stop using vulnerable encryption now for long-term data, or if that’s not possible, shorten the lifetime of your secrets (e.g., enforce secure deletion or rotation so the data doesn’t exist by the time a quantum attack could happen).
- Don’t Panic, but Do Prepare (Starting Yesterday): The goal of highlighting Q-Day is not to incite fear but to promote action. We have solutions (PQC algorithms), and we have time if we use it wisely. The transition will be complex – some PQC algorithms have larger keys or signatures, meaning performance and compatibility issues need to be worked through. There will be new implementation bugs to watch for, and possibly further rounds of standardization (especially for digital signatures, where current PQC options are less mature). But these are solvable engineering challenges, and they are far preferable to the nightmare of waking up one day to find adversaries can trivially break all your encrypted traffic and stored communications. The recent breakthroughs serve as an exclamation point on earlier warnings: the clock is ticking.
To sum up, my updated Q-Day prediction of 2030 is not a prophecy set in stone, but a rational analysis of where current trends are heading. Whether I’m off by a couple years either way doesn’t change the core advice. We are roughly five years out from the first potential quantum disruptions to cryptography, and about ten years out from them becoming widespread. This is within the horizon of strategic IT planning. It means every organization’s 5-year roadmap should include quantum readiness. We’re essentially in the final countdown to Y2Q – akin to the final stretch before Y2K, except this “millennium bug” for encryption doesn’t have a fixed date and won’t announce itself in advance. The prudent course is to act as if Q-Day will hit in the early 2030s, because the cost of being prepared a little early is far lower than the cost of being even one day late.