The 1,000-Qubit Ceiling That Probably Isn’t
21 Mar 2026 – A curious thing happened in cybersecurity Slack channels and LinkedIn threads over the last few days: security professionals who had been dragging their feet on post-quantum cryptography (PQC) migration suddenly had a new reason to wait. A paper published in the Proceedings of the National Academy of Sciences by Tim Palmer, an emeritus professor of physics at Oxford, proposed that quantum computers might face a hard, physics-imposed ceiling of roughly 1,000 useful qubits — far too few, Palmer argued, to ever break RSA-2048.
For CISOs, the headlines seemed to take one massive worry off their plate. “Quantum Computers Could Have a Fundamental Limit After All,” wrote Phys.org. “RSA May Not Be Threatened After All,” echoed The Quantum Insider. Within days, the paper was being cited in enterprise risk meetings as evidence that the quantum threat had been overstated – that perhaps the billions being spent on cryptographic migration were premature.
There is just one problem: the paper doesn’t prove what many people think it proves. It proposes an unverified theoretical framework, builds on a gravitational collapse model whose simplest version has already been experimentally ruled out, and even if its central claim were correct, the “ceiling” it describes may not be high enough to protect the cryptographic algorithms organizations rely on today.
What Palmer Actually Claims
If you are, like many of my readers, a pure cyber professional, feel free to skip the next two sections.
To understand why this paper has generated so much attention, and so much misinterpretation, it helps to dig a bit deeper into the physics.
Palmer’s paper introduces a framework he calls Rational Quantum Mechanics, or RaQM. Palmer is a Fellow of the Royal Society and best known for pioneering work in chaos theory and ensemble weather forecasting – a background that informs his approach, which draws heavily on fractal geometry, p-adic numbers, and dynamical systems theory. The core idea of RaQM is philosophically elegant: standard quantum mechanics treats the state space of quantum systems – the mathematical structure called Hilbert space – as continuous and infinite. A qubit’s state can be described by parameters that take any real or complex value, including irrational numbers that would require infinite information to specify precisely. Palmer proposes that this continuum is an idealization. In his framework, the parameters describing quantum states are restricted to rational numbers – fractions that can be expressed with a finite amount of information.
This restriction has a cascading consequence. In standard quantum mechanics, every qubit you add to a system doubles the number of dimensions in Hilbert space. An N-qubit system lives in a space with $$2^{(N+1)} − 2$$ degrees of freedom – a number that grows exponentially. But in RaQM, the total information available to describe the system grows only linearly: N qubits times L bits per qubit, where L is a very large but finite number that Palmer calls the “granularity parameter.”
When the exponential dimensionality of Hilbert space outstrips the linear information budget, there simply aren’t enough bits to assign even one bit of information to every degree of freedom. The point where this happens defines N_max – the maximum number of qubits that can sustain full quantum behavior. Beyond that threshold, algorithms like Shor’s, which require entanglement spread maximally across the entire Hilbert space, lose their exponential advantage.
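To make the crossover concrete, here is a minimal sketch in Python. It simply finds the point at which the exponential count of degrees of freedom, $$2^{(N+1)} − 2$$, first outgrows the linear budget N × L. The value of L used below is a toy placeholder for illustration only, not one of Palmer’s derived estimates.

```python
# Minimal sketch of the crossover argument (illustrative only).
# The granularity parameter L below is a made-up placeholder; Palmer's
# paper derives platform-specific values that are vastly larger.

def hilbert_degrees_of_freedom(n_qubits: int) -> int:
    """Real degrees of freedom of an n-qubit pure state: 2^(n+1) - 2."""
    return 2 ** (n_qubits + 1) - 2

def information_budget(n_qubits: int, bits_per_qubit: int) -> int:
    """RaQM's linear information budget: N qubits times L bits per qubit."""
    return n_qubits * bits_per_qubit

def n_max(bits_per_qubit: int) -> int:
    """Largest N for which the linear budget still covers every degree of freedom."""
    n = 1
    while information_budget(n + 1, bits_per_qubit) >= hilbert_degrees_of_freedom(n + 1):
        n += 1
    return n

# With a toy L of 2^40 bits per qubit the crossover lands in the mid-40s;
# the exponential always wins eventually, whatever finite L you choose.
print(n_max(2 ** 40))
```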
Palmer estimates N_max at roughly 200 for quantum dot qubits, around 300 for photonic qubits, and approximately 400 for ion trap qubits. He then pushes the estimate to an absolute physical ceiling: even under the most extreme conceivable conditions – a photon with the lowest frequency that fits within the age of the universe, superposed over a distance of one Planck length – N_max cannot exceed 1,000.
It is worth pausing to note how that absolute ceiling of 1,000 is derived. It does not come from measuring any actual quantum system. It comes from a thought experiment involving a photon whose frequency is the inverse of the age of the universe, placed into superposition over a distance of one Planck length – conditions so extreme they are purely hypothetical. This is a constructed extremum, not an empirically derived constant of nature.
The implication Palmer draws is nonetheless stark: since factoring a 2,048-bit RSA integer requires more than 2,048 perfect qubits in the circuit he references, and since N_max will never exceed 1,000, RSA-2048 is fundamentally safe. Not just safe from today’s hardware limitations – safe from physics itself.
The Assumptions Underneath the Assumptions
Palmer’s framework is internally coherent, but it rests on a stack of assumptions, each of which would represent a major departure from established physics if confirmed. This is not a measured refinement of quantum mechanics; it is, as Palmer himself acknowledges, a proposal for an entirely new theory.
The first and most fundamental assumption is that Hilbert space is discrete rather than continuous. This is a legitimate line of inquiry – John Wheeler’s famous “It from Bit” conjecture and Gerard ‘t Hooft’s cellular automaton models have explored similar territory. But no experiment has ever detected such discretization. Standard quantum mechanics, with its continuous Hilbert space, has passed every experimental test thrown at it over nearly a century. Palmer’s paper is explicit that RaQM and standard quantum mechanics are “experimentally indistinguishable for small numbers of qubits,” which means there is currently no empirical evidence for or against the theory.
The second major assumption concerns the physical mechanism behind the discretization. Palmer proposes that gravity is responsible – that the weakness of gravitational interaction is what makes the discretization so fine-grained as to be undetectable at small scales. To put numbers on this idea, he adopts the Diósi-Penrose model of gravitationally induced quantum state collapse, which provides a formula for calculating collapse timescales based on the gravitational self-energy of a quantum superposition.
This is where the story becomes particularly complicated. The Diósi-Penrose model is a serious proposal – Roger Penrose is one of the most distinguished physicists alive, and Lajos Diósi’s work on gravitational decoherence has been influential. But the model’s simplest, parameter-free version was experimentally ruled out in 2021 by a team at the Gran Sasso underground laboratory in Italy. The experiment searched for the faint radiation that the Diósi-Penrose collapse mechanism would produce in a germanium crystal. The expected radiation was not detected, setting a lower bound on the model’s free parameter that excludes the natural version of the theory. More recent analyses have further constrained the parameter space, finding that not all macroscopic systems collapse effectively even within the remaining allowed range.
Palmer explicitly addresses the Gran Sasso result in his paper, noting that RaQM does not invoke the same stochastic collapse mechanism and produces no local energy dissipation. This is a defensible response – but it means RaQM is not simply applying the Diósi-Penrose model off the shelf. It is using the model’s timescale formula while rejecting the physical mechanism that makes the model experimentally testable. The numerical estimates for L and N_max therefore depend on a model whose quantitative predictions have been partially falsified, applied in a context the model was not designed for.
The third layer of assumption involves extending the Diósi-Penrose framework – developed for Newtonian gravity and massive particles – to photonic qubits by substituting relativistic mass-energy for rest mass. Palmer acknowledges this extension explicitly, noting that “the Diósi-Penrose model was developed within a Newtonian gravitational framework” and that he is assuming “it will continue to hold in a post-Newtonian framework.” This is not unreasonable as a starting point for estimation, but it is a significant theoretical leap that introduces additional uncertainty into the already uncertain N_max values.
The PNAS Publication: What It Does and Doesn’t Mean
The fact that this paper appeared in PNAS, one of the most prestigious scientific journals in the world, has been widely cited as evidence of the theory’s credibility. This deserves some context.
The paper was reviewed by four distinguished physicists – Ivette Fuentes, Nicolas Gisin, Lucien Hardy, and Stephen Hsu – all of whom have made important contributions to quantum foundations. Their willingness to review the paper and PNAS’s decision to publish it reflects a judgment that the ideas are worth serious consideration and discussion. It does not constitute an endorsement of the paper’s conclusions.
PNAS published the paper as a “contributed” article, a track available to members of the National Academy of Sciences (Palmer is a Fellow of the Royal Society, which affords similar access). The reviewers assessed whether the paper was scientifically sound enough to merit publication and discussion. But publication in PNAS is not the same as experimental confirmation, and the paper itself is clear that its predictions are “testable in less than 5 years,” implying that they have not yet been tested.
The Number Everyone Is Getting Wrong
The media coverage of this paper has consistently framed the claim as “quantum computers are limited to 1,000 qubits.” This is misleading in several important ways.
First, Palmer’s N_max refers to perfect qubits participating in maximal superposition and entanglement across the full Hilbert space. This is a critical qualifier that gets lost in translation. Palmer is not claiming that quantum hardware vendors cannot manufacture chips with more than 1,000 physical qubits – they already have. He is claiming that algorithms requiring entanglement spread maximally across all $$2^{(N+1)} − 2$$ dimensions of Hilbert space will lose their exponential advantage beyond N_max. But real quantum computers use error correction, which encodes logical qubits across many physical qubits operating in carefully managed subspaces. Palmer’s paper acknowledges this complication but does not fully resolve whether his information-capacity limit applies to the logical qubits of an error-corrected system, to the physical qubits, or to some more subtle interaction between the two. This ambiguity is not a minor detail – it is the entire question for practical quantum computing.
Second, and more critically for cryptographic security assessments, the claim that “RSA-2048 requires more than 2,048 qubits and therefore cannot be broken” relies on a specific, unoptimized circuit from a 2024 survey paper that implements Shor’s algorithm using N qubits to factor an (N-1)-bit semiprime. This is not the state of the art. The resource landscape for quantum factoring has been transformed by a decade of algorithmic optimization, and the number of logical qubits required has been dropping at a pace that should give anyone citing the 1,000-qubit ceiling serious pause.
The Algorithmic Bulldozer: Why the “Ceiling” Is Already Uncomfortably Close
Consider the trajectory. In 2019, Craig Gidney and Martin Ekerå estimated that factoring RSA-2048 would require approximately 20 million noisy physical qubits. By May 2025, Gidney had reduced that estimate to under one million noisy qubits – a 20× improvement achieved through approximate residue arithmetic, yoked surface codes, and magic state cultivation. That design uses approximately 1,399 logical qubits.
Read that number again: 1,399 logical qubits. Palmer’s most generous estimate for ion trap qubits puts N_max at approximately 400. But his absolute upper bound – the “will never exceed” figure that accounts for any conceivable future technology – is 1,000. The current best algorithm for breaking RSA-2048 requires 1,399 logical qubits. The gap between Palmer’s ceiling and the real-world requirement has already narrowed to less than a factor of two.
And the requirement keeps falling. In February 2026, Iceberg Quantum published the Pinnacle Architecture, which uses quantum low-density parity-check (qLDPC) codes instead of surface codes. Under standard hardware assumptions, the architecture claims to factor RSA-2048 with fewer than 100,000 physical qubits. As my analysis of the Pinnacle paper noted, the design shifts the engineering difficulty from qubit count to connectivity, decoding speed, and sustained operation time – but it does so while further compressing the logical qubit requirement.
The pattern here is unmistakable. Every two to three years, the estimated resource requirements for quantum factoring drop by roughly an order of magnitude. This is driven not by hardware improvements but by pure algorithmic and architectural ingenuity – approximate arithmetic, better error correction codes, more efficient circuit compilation. Palmer’s paper treats the qubit count required for Shor’s algorithm as essentially fixed, referencing an unoptimized circuit. The actual research community has been systematically demolishing those numbers.
If Gidney’s 2025 estimate of 1,399 logical qubits is reduced by even 30% through future algorithmic improvements – an entirely plausible prospect given the trajectory – the requirement falls below Palmer’s absolute ceiling of 1,000. At that point, even if RaQM were entirely correct, it would offer no protection to RSA-2048.
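As a sanity check on that arithmetic, here are a few lines of Python using only the figures already cited in this article – a back-of-envelope sketch, not new data:

```python
# Back-of-envelope check of the gap described above, using the figures
# cited in this article (Palmer's proposed ceiling, Gidney's 2025 estimate).

palmer_absolute_ceiling = 1_000      # Palmer's "will never exceed" bound
gidney_2025_logical_qubits = 1_399   # logical qubits in Gidney's 2025 RSA-2048 design

gap = gidney_2025_logical_qubits / palmer_absolute_ceiling
print(f"Requirement / ceiling: {gap:.2f}x")  # ~1.40x, i.e. less than a factor of two

# A further ~30% cut in logical-qubit count -- in line with the historical
# pace of algorithmic improvement -- drops the requirement below the ceiling.
after_30_percent_cut = gidney_2025_logical_qubits * 0.7
print(f"After a 30% reduction: {after_30_percent_cut:.0f} logical qubits")  # ~979
```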
The Threat Nobody Mentions: ECC Is Already Brushing Against the Ceiling
The public discussion of Palmer’s paper has focused almost exclusively on RSA. This is understandable – RSA-2048 is the standard benchmark for quantum cryptanalytic threat assessments. But it is also misleading, because elliptic curve cryptography (ECC) is far more vulnerable to quantum attack, and the latest resource estimates are already approaching Palmer’s own proposed ceiling.
As I have analyzed in detail, adapting Shor’s algorithm for elliptic curve discrete logarithm problems involves more complex arithmetic per bit, but the key sizes are dramatically smaller. ECC-256 — the workhorse of modern TLS, digital signatures, and cryptocurrency wallets — operates on 256-bit keys rather than RSA’s 2,048-bit keys.
The numbers here are striking. A February 2026 paper by Chevignard, Fouque, and Schrottenloher — accepted at EUROCRYPT 2026, one of the premier venues in cryptography — presents the most space-efficient polynomial-time algorithm for the elliptic curve discrete logarithm problem to date. Their estimate for breaking P-256: 1,193 logical qubits. That is achieved at the cost of a substantially higher gate count ($$2^{38}$$ Toffoli gates across 22 independent runs), but the qubit count – the dimension that Palmer’s ceiling constrains – is already within shouting distance of 1,000.
Let that sink in. Palmer’s absolute upper bound, derived from cosmological extremes, is 1,000. The current best estimate for breaking the most widely deployed elliptic curve in the world is 1,193 logical qubits. The gap is less than 20%. And this is the result of a first-generation optimization effort for ECC – the same team’s earlier work on RSA factoring (CRYPTO 2025) achieved similar space reductions that Gidney then leveraged and improved further. If the ECC-specific optimization trajectory follows the same pattern, sub-1,000-qubit attacks on P-256 are not a distant prospect; they are a plausible near-term development.
A separate ePrint 2026/106 paper takes a complementary approach, focusing on circuit depth optimization for ECDLP and demonstrating that P-521 can be attacked well within NIST’s MAXDEPTH constraint. The research community is now optimizing ECC attacks on multiple fronts simultaneously – qubit count, gate count, and circuit depth – and the results are converging toward resources that would fit comfortably within even a constrained quantum computer.
The asymmetry matters profoundly for risk planning. Organizations that depend on ECC – which, in practice, means virtually every organization that uses TLS 1.3, ECDSA signatures, or ECDH key exchange – cannot take comfort from a theoretical ceiling on qubit count. Even if Palmer’s ceiling were real, ECC may already be within its reach.
Why This Should Not Delay Your Migration
The most dangerous consequence of Palmer’s paper is not the paper itself – it is the way it is being weaponized in boardrooms as an argument against urgency. Let me be direct: using an unverified theoretical framework as a reason to delay PQC migration is reckless risk management.
If you need a single line for your next board meeting: Even if Palmer were entirely right, his ceiling would not protect you – because ECC attack estimates are already within 20% of his limit, RSA attack estimates are within 40%, algorithmic compression continues to drive both numbers down, and harvest-now-decrypt-later exposure is accumulating today regardless of when a CRQC arrives.
Here is what we know with certainty. Standard quantum mechanics has been confirmed by every experiment conducted over the past century. RaQM has been confirmed by no experiments. The Diósi-Penrose collapse model, on which Palmer’s numerical estimates depend, has had its parameter-free version experimentally falsified. The remaining allowed parameter space is constrained and contested.
Here is what the historical record tells us about quantum resource estimates. In 2012, estimates for breaking RSA-2048 stood at roughly one billion physical qubits. By 2019, that dropped to 20 million. By 2025, it was under one million. By early 2026, one architecture claims under 100,000. At no point in this trajectory has the estimate moved upward. The trend is consistently, relentlessly, downward.
Here is what prudent risk management requires. NIST has recommended deprecating quantum-vulnerable cryptographic systems after 2030 and disallowing them after 2035. The NSA’s CNSA 2.0 guidance requires migration to post-quantum algorithms by 2033 for national security systems. These timelines were not set based on Palmer’s framework – they were set based on the assessed trajectory of quantum hardware and algorithmic development, combined with the recognition that cryptographic migrations take years to complete.
The “harvest now, decrypt later” (HNDL) threat compounds all of this. Even if a cryptanalytically relevant quantum computer (CRQC) does not arrive until 2035 or 2040, adversaries are already capturing encrypted traffic today for future decryption. Data with long confidentiality requirements — medical records, intelligence, trade secrets, financial transactions — is at risk now, regardless of when a CRQC materializes. Palmer’s theoretical framework, even if correct, does nothing to address this threat, because the data is being harvested under the assumption that quantum decryption will eventually become feasible.
What Organizations Should Do Now
Palmer’s paper is intellectually stimulating and may ultimately contribute to our understanding of quantum foundations. It should not influence your cryptographic migration timeline. Here is what should:
Complete your cryptographic inventory if you haven’t already, with particular focus on ECC deployments in TLS 1.2/1.3, code signing, and key exchange. ECC is more vulnerable to quantum attack than RSA, and it is vastly more pervasive in modern infrastructure.
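As a starting point for that inventory, here is a minimal sketch using the open-source Python cryptography package to flag whether a given certificate carries an ECC or RSA public key. The file path is a placeholder, and a real inventory would of course walk certificate stores, TLS endpoints, and code-signing infrastructure rather than a single file.

```python
# Minimal sketch: classify a certificate's public-key algorithm as part of a
# cryptographic inventory. Requires the open-source `cryptography` package;
# the file path below is a placeholder.
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import ec, rsa

def classify_certificate(pem_path: str) -> str:
    with open(pem_path, "rb") as f:
        cert = x509.load_pem_x509_certificate(f.read())
    key = cert.public_key()
    if isinstance(key, ec.EllipticCurvePublicKey):
        return f"ECC ({key.curve.name}) - quantum-vulnerable, prioritize"
    if isinstance(key, rsa.RSAPublicKey):
        return f"RSA-{key.key_size} - quantum-vulnerable"
    return type(key).__name__  # e.g. a PQC key type once such certs appear

print(classify_certificate("server.pem"))  # placeholder path
```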
Prioritize systems with long data retention requirements. Any data that must remain confidential beyond 2030 should be treated as exposed to HNDL risk today. This includes healthcare records, financial transactions, government communications, and intellectual property.
Begin testing ML-KEM (formerly CRYSTALS-Kyber) for key encapsulation and ML-DSA (formerly CRYSTALS-Dilithium) for digital signatures. NIST finalized these standards in 2024, and major implementations are already available in OpenSSL, BoringSSL, and most enterprise TLS stacks.
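For a quick local smoke test before touching production TLS stacks, one option is the open-source liboqs Python bindings – an assumption on my part about your tooling; your environment may instead expose ML-KEM directly through a recent OpenSSL build. A minimal encapsulation round trip looks roughly like this:

```python
# Minimal ML-KEM-768 encapsulation round trip, assuming the liboqs Python
# bindings (`pip install liboqs-python`) and a liboqs build that exposes the
# "ML-KEM-768" mechanism name (older builds may still call it "Kyber768").
import oqs

with oqs.KeyEncapsulation("ML-KEM-768") as receiver:
    public_key = receiver.generate_keypair()          # receiver publishes this
    with oqs.KeyEncapsulation("ML-KEM-768") as sender:
        ciphertext, secret_sender = sender.encap_secret(public_key)
    secret_receiver = receiver.decap_secret(ciphertext)

assert secret_sender == secret_receiver
print("ML-KEM-768 round trip OK:", len(secret_sender), "byte shared secret")
```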
Monitor the algorithmic landscape, not just the hardware landscape. The most significant developments in quantum threat timelines over the past three years have come from algorithmic improvements, not hardware milestones. A breakthrough in quantum error correction or circuit optimization can shift threat timelines by years, overnight, with no advance warning.
Do not anchor your risk models to any single theoretical framework — whether optimistic or pessimistic. Plan for a range of scenarios, and ensure your migration timeline can accommodate the possibility that quantum threats materialize earlier than current median estimates suggest.
The Real Significance of Palmer’s Paper
Strip away the media hype and the boardroom appropriation, and Palmer’s paper does something genuinely valuable: it reminds us that the foundations of quantum mechanics are not settled. The question of whether Hilbert space is truly continuous or secretly discrete is profound, and it connects to some of the deepest unsolved problems in physics – the unification of quantum mechanics and gravity, the nature of information, the meaning of measurement.
If RaQM is eventually confirmed by experiment – Palmer suggests this could be testable within five years using quantum computers themselves – it would be one of the most important discoveries in the history of physics. It would reshape not just quantum computing but our fundamental understanding of reality. That is a remarkable scientific claim, and it deserves serious investigation.
But “deserves serious investigation” and “should govern your 2026 cybersecurity budget” are very different statements. The former is for physicists. The latter is for CISOs – and for CISOs, the evidence overwhelmingly points in one direction: migrate now, while the algorithms are standardized, the implementations are maturing, and you still have time to do it without panic.
The 1,000-qubit ceiling may or may not exist. The quantum threat to your cryptographic infrastructure does not depend on the answer.
Quantum Upside & Quantum Risk - Handled
My company - Applied Quantum - helps governments, enterprises, and investors prepare for both the upside and the risk of quantum technologies. We deliver concise board and investor briefings; demystify quantum computing, sensing, and communications; craft national and corporate strategies to capture advantage; and turn plans into delivery. We help you mitigate the quantum risk by executing crypto‑inventory, crypto‑agility implementation, PQC migration, and broader defenses against the quantum threat. We run vendor due diligence, proof‑of‑value pilots, standards and policy alignment, workforce training, and procurement support, then oversee implementation across your organization. Contact me if you want help.