
Project Eleven Publishes 110-Page Quantum Threat Report for Blockchains — Rigorous Analysis Amid a Season of Denial

May 6, 2026 – Project Eleven, the post-quantum cryptography startup led by CEO Alex Pruden and CTO Conor Deegan, today published “The Quantum Threat to Blockchains — 2026 Report,” a 110-page analysis covering the full landscape of quantum risk to digital assets. The report spans quantum hardware modalities, error correction, resource estimation trends, chain-specific vulnerability profiles, NIST post-quantum cryptography standards, and a detailed blockchain migration framework.

The timing is notable. Two days ago, I published an analysis of how pseudoscience and denial have colonized Bitcoin’s quantum security discourse — from a 222-page unreferenced paper claiming Bitcoin’s block interval disproves quantum mechanics, to a panel at Bitcoin 2026 where half the participants dismissed the threat outright. Pruden himself sat on that panel, making the case for urgency while his co-panelists called quantum computing “science fiction.”

This report is the full version of that case. A disclosure before I continue: Project Eleven asked me to review the report before publication, and the report cites my CRQC Quantum Capability Framework as a reference. Separately, Project Eleven builds and commercializes post-quantum infrastructure for blockchain systems, so they have a commercial interest in the threat being taken seriously. With both of those on the table: the analytical work here is serious, well-sourced, and in several places genuinely original. For anyone who sat through the Bitcoin 2026 panels and wondered what the science actually says when stripped of ideology, this is a good place to start.

My Analysis

What the Report Gets Right

The report’s strongest contribution is its structured framework for thinking about quantum computing progress. It organizes the path to a cryptographically relevant quantum computer (CRQC) into a four-layer stack: physics (qubit quality), error correction (physical-to-logical qubit ratio), system integration (decoder, feed-forward, stability), and algorithm demand (the requirements Shor’s algorithm imposes). This is a useful simplification. Readers familiar with my work will recognize the structure as conceptually aligned with my CRQC Quantum Capability Framework, which the report cites directly. The key insight both frameworks share: progress toward a CRQC is not measured by physical qubit count alone. It is measured across multiple capability dimensions that interact multiplicatively.
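To make the multiplicative point concrete, here is a toy sketch in Python. The dimension names, weights, and the bare product are mine and purely illustrative; neither the report nor my framework scores readiness this way, but the shape of the argument is the same: one lagging layer caps the whole stack.

```python
# Illustrative sketch only: a toy "readiness" score showing why capability
# dimensions combine multiplicatively rather than additively. The dimension
# names and values are hypothetical, not the report's (or my framework's)
# actual scoring model.

def crqc_readiness(dimensions: dict[str, float]) -> float:
    """Each dimension is a fraction of the level a CRQC requires (0.0 to 1.0).
    A multiplicative model means one lagging layer caps overall readiness."""
    score = 1.0
    for value in dimensions.values():
        score *= value
    return score

# A platform strong on raw qubit quality but weak on error correction and
# system integration still scores low overall.
example = {
    "physics_qubit_quality": 0.8,
    "error_correction": 0.3,
    "system_integration": 0.2,
    "algorithm_demand_met": 0.4,
}
print(f"Toy readiness score: {crqc_readiness(example):.3f}")  # ~0.019
```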

The resource estimation analysis is particularly well done. The report traces the dramatic collapse in estimated qubit requirements: from Gidney and Ekerå’s 20 million physical qubits for RSA-2048 in 2021 to Gidney’s under one million in 2025, then to Pinnacle’s 100,000 with qLDPC codes, and finally to the two March 2026 papers that changed the conversation: Google/Babbush et al.’s estimate of fewer than 500,000 superconducting physical qubits for secp256k1 ECDLP, with full runtimes of roughly 18–23 minutes and a primed on-spend window of roughly 9–12 minutes; and Cain/Oratomic’s neutral-atom proposal, which claims Shor-scale computation with as few as 10,000 reconfigurable atomic qubits and P-256 discrete-log runtimes of a few days at roughly 26,000 physical qubits. The report correctly identifies that these two architectures represent different threat profiles: fast-clock superconducting, silicon, or photonic systems make on-spend attacks against active transactions plausible, while initial slow-clock neutral-atom or ion-trap systems are more naturally at-rest threats against dormant wallets and long-exposed keys. That distinction is not absolute, but it is operationally important. Both paths are credible. Both lead to the same conclusion.

The blockchain-specific vulnerability analysis is thorough. Project Eleven’s own Bitcoin Risq List puts approximately 6.9 million BTC (roughly one-third of circulating supply) in quantum-vulnerable addresses. The dominant exposure is address reuse: about 4.99 million BTC, or 72.3% of the exposed total. The largest script-type exposure is legacy P2PK, at about 1.72 million BTC (24.8%), because the public key is embedded directly in the output. Taproot P2TR accounts for about 198,000 BTC (2.9%) because the x-only public key is visible in the address; bare multisig is negligible by comparison. For Ethereum, the report relies on Deloitte’s earlier finding that over 65% of Ether was in quantum-exposed addresses — a broader exposure surface than Bitcoin’s, driven by Ethereum’s account model and by public-key recovery from signed transactions. The report compounds that account-layer exposure with two additional quantum-vulnerable primitives: Ethereum’s consensus-layer BLS12-381 signatures and KZG/EIP-4844 data availability commitments.
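As a quick sanity check, the exposure categories the report breaks out do sum to the quoted total. A few lines of Python reproduce the shares from the figures cited above; small differences from the quoted percentages are rounding, and the negligible bare-multisig bucket is omitted.

```python
# Sanity check on the Bitcoin exposure figures cited above (millions of BTC).
# Categories and amounts are the report's; rounding explains small deviations
# from the quoted percentages.
exposed_btc_m = {
    "address_reuse": 4.99,
    "legacy_p2pk": 1.72,
    "taproot_p2tr": 0.198,
}
total = sum(exposed_btc_m.values())  # ~6.9M BTC, roughly one-third of supply
for category, amount in exposed_btc_m.items():
    print(f"{category:>14}: {amount:5.2f}M BTC  ({amount / total:5.1%})")
```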

The stablecoin analysis deserves separate attention. The report makes a point I have not seen emphasized this clearly elsewhere: a quantum attack on a major stablecoin would not require draining individual wallets. It would target the contract’s admin keys. A compromised minting authority can create unbacked tokens. A compromised proxy admin can rewrite the contract’s logic entirely. Because major upgradeable stablecoin contracts often place high-privilege roles — proxy admin, owner, master minter, blacklister, pauser — behind small ECDSA signer sets or multisigs, a quantum attacker who can recover enough exposed signer keys could seize control of the contract itself. The blast radius depends on role design, signer thresholds, timelocks, and minting ceilings. But for proxy-admin or master-minter paths, the risk can approach the whole currency rather than a single user balance. That is a qualitatively different risk profile from attacking a base-layer protocol.
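To make the role-and-threshold reasoning concrete, here is a deliberately simplified sketch. The role names, signer counts, and thresholds are hypothetical, not any particular stablecoin’s configuration; the point is only that a role is at risk once enough of its signers have exposed public keys.

```python
# Hedged toy model of the role-based blast-radius point above. Role names,
# signer counts, and thresholds are hypothetical, not any specific
# stablecoin's actual configuration.
from dataclasses import dataclass

@dataclass
class PrivilegedRole:
    name: str
    signers_exposed: int   # signers whose ECDSA public keys are already visible on-chain
    threshold: int         # signatures required to exercise the role
    blast_radius: str      # qualitative impact if the role is compromised

roles = [
    PrivilegedRole("proxy_admin", signers_exposed=2, threshold=2,
                   blast_radius="rewrite contract logic (whole currency)"),
    PrivilegedRole("master_minter", signers_exposed=1, threshold=2,
                   blast_radius="mint unbacked tokens up to the ceiling"),
    PrivilegedRole("pauser", signers_exposed=1, threshold=1,
                   blast_radius="freeze transfers (denial of service)"),
]

# A role falls to a quantum attacker once enough of its signers have exposed
# public keys to meet the signing threshold.
for role in roles:
    at_risk = role.signers_exposed >= role.threshold
    print(f"{role.name:>13}: at risk={at_risk}  -> {role.blast_radius}")
```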

Where It Adds Something New

Two technical contributions stand out as genuinely original.

First, the report highlights an asymmetry in migration prospects between EdDSA and ECDSA chains that deserves wider attention. EdDSA chains (Solana, Sui, Aptos, Near, Stellar) use RFC-8032 key derivation, which computes the signing scalar from a seed via a hash function. If Shor’s algorithm recovers the signing scalar, it cannot reverse the hash to obtain the underlying seed. This structural property could enable post-quantum zero-knowledge proofs of seed ownership, allowing accounts to bind their existing address to a new PQ key without moving funds. ECDSA chains generally lack this property at the signature-scheme level. Some ECDSA wallets derive scalars from seeds using BIP-32/BIP-85-style constructions, but that does not create the same clean, protocol-wide ZK migration path: proving the relevant derivation is wallet-specific and expensive, and it does not help dormant accounts where the derivation path is unknown. The practical implication remains: EdDSA chains may have a meaningfully cleaner migration path. This distinction is sourced to a 2025 IACR ePrint by Baldimtsi et al. and merits closer examination.
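For readers who want to see the structural property rather than take it on faith, here is a minimal sketch of the RFC-8032 scalar derivation using only the Python standard library. It shows why recovering the signing scalar, which is what Shor yields, does not expose the seed; an actual migration mechanism would wrap a derivation like this in a post-quantum zero-knowledge proof, which is well beyond this sketch.

```python
# Minimal sketch of the RFC-8032 (Ed25519) key derivation: the signing scalar
# is computed from the 32-byte seed through SHA-512, so recovering the scalar
# does not reverse the hash and expose the seed. Standard library only.
import hashlib
import os

def ed25519_scalar_from_seed(seed: bytes) -> int:
    """RFC 8032, section 5.1.5: hash the seed with SHA-512, take the lower
    32 bytes, and 'clamp' them to form the signing scalar."""
    assert len(seed) == 32
    h = hashlib.sha512(seed).digest()
    lower = bytearray(h[:32])
    lower[0] &= 0xF8      # clear the three lowest bits
    lower[31] &= 0x7F     # clear the highest bit
    lower[31] |= 0x40     # set the second-highest bit
    return int.from_bytes(lower, "little")

seed = os.urandom(32)                     # the wallet's actual secret
scalar = ed25519_scalar_from_seed(seed)   # what Shor could recover from the public key
# Getting from `scalar` back to `seed` would mean inverting SHA-512,
# which Shor's algorithm does not provide.
print(hex(scalar))
```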

Second, PQC Suite B (detailed in Appendix F) is a proposal co-authored by JP Aumasson, Conor Deegan, Alex Pruden, and Zooko Wilcox-O’Hearn to replace the internal hash functions in ML-DSA (FIPS 204) and SLH-DSA (FIPS 205) with BLAKE3. The headline performance claim is strongest for ML-DSA-B: preliminary benchmarks report up to 20% faster signing, up to 30% faster verification, and much faster message pre-hashing. For SLH-DSA-B, the result is more hardware-dependent: SHAKE is consistently slower in their tests, while BLAKE3 and SHA-2 are closer and platform-sensitive. In both cases, the attraction for blockchains is clear: hash substitution changes performance without changing key or signature sizes. One caveat matters for regulated deployments: ML-DSA-B and SLH-DSA-B are proposed variants of NIST-standardized algorithms, not FIPS 204/205-compliant algorithms unless and until NIST standardizes those hash substitutions. Whether NIST engages with this proposal remains to be seen.
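The hash layer itself is easy to benchmark in isolation, which is part of the proposal’s appeal. The sketch below compares raw SHAKE256, SHA-256, and BLAKE3 throughput on whatever machine runs it; it assumes the third-party blake3 package is installed, and it measures only message hashing, not full ML-DSA or SLH-DSA signing, so treat the output as directional rather than as a reproduction of the paper’s benchmarks.

```python
# Rough micro-benchmark of the hash layer only, not ML-DSA/SLH-DSA signing.
# Assumes the third-party `blake3` package (pip install blake3) is available.
import hashlib
import time

try:
    from blake3 import blake3
except ImportError:
    blake3 = None

def throughput(label, hash_once, data, rounds=200):
    start = time.perf_counter()
    for _ in range(rounds):
        hash_once(data)
    elapsed = time.perf_counter() - start
    mb = len(data) * rounds / 1e6
    print(f"{label:>8}: {mb / elapsed:7.1f} MB/s")

data = b"\x00" * (1 << 20)  # 1 MiB message, stands in for pre-hashing input
throughput("SHAKE256", lambda d: hashlib.shake_256(d).digest(32), data)
throughput("SHA-256", lambda d: hashlib.sha256(d).digest(), data)
if blake3 is not None:
    throughput("BLAKE3", lambda d: blake3(d).digest(), data)
```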

Where I’d Push Back

The report’s Q-Day model (Appendix E) is transparent about its methodology but should be consumed carefully. It works backwards from a fixed target logical error rate (10⁻¹⁵), calculates the required code distance, and then projects when hardware will reach that distance. The authors acknowledge that this approach inflates code distance relative to published resource estimates, forcing them to compensate with artificially high error suppression factors (5–10, compared to the ~2 observed in recent Google and Quantinuum experiments). This is a modeling choice, not an error, but it means the resulting Q-Day estimates (baseline 2033, optimistic 2030, pessimistic 2042) are less directly comparable to bottom-up, compiled-circuit resource estimates from groups like Google Quantum AI. One additional caveat: Appendix E’s demand side is anchored to RSA-2048 via Shor’s algorithm even though the blockchain threat profile is primarily ECC-256. The report itself notes that elliptic curves likely require less width and depth. That makes the 2030/2033/2042 chart useful as a transparent scenario model, not a bottom-up ECC-256 resource estimate. The sensitivity analysis is informative — physical qubit count and quality dominate, followed by error correction efficiency — but the model’s acknowledged departure from physics at the suppression factor level limits how much weight these specific year estimates should carry. My own CRQC Readiness Benchmark methodology takes a different approach that avoids this particular inflation problem.
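The trade-off is easy to see with the standard back-of-envelope scaling. Writing Λ for the suppression factor, if the logical error rate falls roughly as Λ^(-(d+1)/2) in code distance d, then the distance needed to hit a fixed 10⁻¹⁵ target grows sharply as Λ drops toward the ~2 seen in experiments. The prefactor below is mine and purely illustrative, not the report’s parameterization.

```python
# Back-of-envelope sketch, not the report's model: assume the logical error
# rate scales as prefactor * Lambda**(-(d+1)/2) for code distance d, and
# solve for the smallest odd d that reaches a 1e-15 target. The prefactor
# is illustrative.
import math

def required_distance(target: float, suppression: float,
                      prefactor: float = 0.1) -> int:
    """Smallest odd code distance d with prefactor * suppression**(-(d+1)/2)
    at or below the target logical error rate."""
    d = math.ceil(2 * math.log(prefactor / target, suppression) - 1)
    return d if d % 2 == 1 else d + 1  # surface-code distances are odd

for lam in (2.0, 5.0, 10.0):
    print(f"Lambda = {lam:4.1f} -> required distance ~ {required_distance(1e-15, lam)}")
```

Under these illustrative assumptions the required distance lands near 90 at Λ = 2, roughly 40 at Λ = 5, and under 30 at Λ = 10, which is exactly why the model needs suppression factors well above current experimental values to keep its distances, and therefore its qubit counts, plausible.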

The Timing Tells a Story

I want to return to why the timing of this report matters. Two days ago, I described three archetypes in Bitcoin’s quantum debate: the Deniers, the Grifters, and the Engineers. The Deniers produce sophisticated-sounding arguments for why the threat isn’t real. The Grifters exploit the panic or sell false solutions. The Engineers do the actual work: writing BIPs, benchmarking PQC algorithms, analyzing migration throughput, building the tools the ecosystem will need.

This report is Engineer work. One hundred and ten pages of it. The kind of work that gets drowned out when conference stages are given over to arguments that Bitcoin’s block interval disproves quantum mechanics.

Is it perfect? No. I’ve noted the disclosure above, and the Q-Day model has the methodological limitations I’ve described.

But within its scope — a blockchain-focused audience that needs to understand what quantum computing means for their assets, their protocols, and their migration timelines — this is one of the best single-document summaries I have seen. It synthesizes the key resource estimation papers (Gidney, Pinnacle, Babbush/Google, Oratomic/Cain, Chevignard), contextualizes them within a structured capability framework, applies them to specific blockchain vulnerability profiles, and produces a concrete migration framework with realistic throughput analysis. The appendices, particularly the treatment of Shor’s algorithm variants and the quantum error correction formalism, are detailed enough to serve as reference material.

The blockchain industry has a choice to make. It can listen to people with no physics credentials explain why quantum mechanics is wrong. Or it can engage with the work being done by teams that understand both the cryptography and the engineering.

The deadlines are already set, even if blockchains are not directly bound by them. NIST’s transition timeline targets deprecating and removing quantum-vulnerable public-key algorithms from its standards by 2035, with high-risk systems moving earlier. NSA’s CNSA 2.0 milestones require exclusive use of quantum-resistant algorithms for several NSS categories by 2030 or 2033. Decentralized protocols should read those dates as market-pressure and ecosystem-alignment deadlines, not legal compliance deadlines. And migration timelines for decentralized protocols are measured in years, not months. The work described in this report is the work that needs to happen. Whether the community chooses to do it is a different question — but at least they can no longer claim they weren’t given the analysis.

The full report is available as a PDF from Project Eleven.

Quantum Upside & Quantum Risk - Handled

My company - Applied Quantum - helps governments, enterprises, and investors prepare for both the upside and the risk of quantum technologies. We deliver concise board and investor briefings; demystify quantum computing, sensing, and communications; craft national and corporate strategies to capture advantage; and turn plans into delivery. We help you mitigate the quantum risk by executing crypto‑inventory, crypto‑agility implementation, PQC migration, and broader defenses against the quantum threat. We run vendor due diligence, proof‑of‑value pilots, standards and policy alignment, workforce training, and procurement support, then oversee implementation across your organization. Contact me if you want help.


Marin Ivezic

I am the Founder of Applied Quantum (AppliedQuantum.com), a research-driven consulting firm empowering organizations to seize quantum opportunities and proactively defend against quantum threats. A former quantum entrepreneur, I’ve previously served as a Fortune Global 500 CISO, CTO, Big 4 partner, and leader at Accenture and IBM. Throughout my career, I’ve specialized in managing emerging tech risks, building and leading innovation labs focused on quantum security, AI security, and cyber-kinetic risks for global corporations, governments, and defense agencies. I regularly share insights on quantum technologies and emerging-tech cybersecurity at PostQuantum.com.