How ECC Became the Easiest Quantum Target
In 1985, when Victor Miller and Neal Koblitz independently proposed using elliptic curves for public-key cryptography, they solved what looked like an unassailable problem. RSA keys were getting longer – bloating handshakes, straining smartcards, slowing down every transaction that depended on them. Elliptic curve cryptography offered the same security with a fraction of the key material: 256 bits instead of 3,072. For the systems that run modern infrastructure – where milliseconds matter, where billions of devices need compact keys, where every TLS handshake is a cost center – ECC was a gift.
Four decades later, that gift is looking increasingly like a trap.
Shor’s algorithm, the quantum procedure that threatens all public-key cryptography built on number-theoretic hardness, does not care about classical security equivalences. It does not care that breaking either P-256 or RSA-3072 would take a classical computer roughly $$2^{128}$$ operations.
Shor’s algorithm cares about one thing: the size of the numbers involved. And ECC’s numbers are small.
The result is what we might call the quantum security inversion – a scenario in which the very property that made ECC dominant in modern identity, communications, and enterprise infrastructure makes it the easier quantum target. At equivalent classical security levels, breaking P-256 requires roughly 2.6 times fewer logical qubits and 148 times fewer quantum gates than breaking RSA-3072.
The picture, however, is more nuanced than a simple “ECC is easier” claim. At equivalent classical security levels – P-256 versus RSA-3072 – the quantum advantage against ECC is dramatic and unambiguous. But at commonly deployed sizes, the comparison is messier. Thanks to aggressive recent optimization of RSA factoring circuits (particularly Gidney’s 2025 construction), breaking RSA-2048 now requires fewer logical qubits and roughly 19× fewer Toffoli gates than breaking P-256 at the Roetteler baseline. A CRQC optimized for Gidney’s algorithm could factor RSA-2048 before it could solve the P-256 ECDLP – at least until comparable optimization effort is applied to elliptic curve circuits, which has not yet happened. The “ECC falls first” thesis depends on which RSA key size you compare against, and on how much further ECC circuit optimization can go.
This is not a theoretical curiosity. P-256 ECDSA secures every passkey in circulation. X25519 ECDH protects every TLS 1.3 handshake. secp256k1 signs every Bitcoin and Ethereum transaction. The cryptographic system most deeply embedded in modern infrastructure is the one most exposed to quantum attack.
In this article I’ll try to dig into the mathematics behind that inversion – how Shor’s algorithm works differently against RSA, ECC, and Diffie-Hellman, why the resource profiles diverge so dramatically, and what the latest quantum resource estimates tell us about the practical timeline. The bottom line for CISOs and security architects, though, is immediate: your ECC migration strategy may need to come before your RSA one, not after.
TL;DLM;DR (Too Long; Don’t Like Math; Didn’t Read)
ECC is more vulnerable to quantum computers than RSA – not despite being “better” cryptography, but because of it. ECC’s classical superiority comes from using much smaller keys: 256 bits instead of RSA’s 3,072 for the same security level. But Shor’s quantum algorithm doesn’t care how clever the math is. It just cares how big the numbers are. Smaller keys = fewer qubits needed at equivalent security levels = easier to break. The problem is that ECC is everywhere: it protects every TLS 1.3 handshake, every passkey, every Bitcoin transaction, every WhatsApp message. The cryptography the world adopted because it was the most efficient will, for exactly that reason, be the first to fall.
One algorithm, three kills: the hidden subgroup framework
The most important thing to understand about Shor’s algorithm is that it is not three separate attacks. RSA, Diffie-Hellman, and elliptic curve cryptography appear mathematically distinct – factoring integers, computing discrete logarithms in finite fields, computing discrete logarithms on elliptic curves. But from a quantum perspective, all three reduce to a single structural problem.
That problem is the Hidden Subgroup Problem (HSP) over finite abelian groups, formalized in the quantum setting by Kitaev in 1995. The setup is this: you have a finite abelian group $$G$$, a hidden subgroup $$H \le G$$, and a function $$f: G \to X$$ that is constant on cosets of $$H$$ and distinct across different cosets. The quantum algorithm to find $$H$$ follows a universal template:
- Prepare a uniform superposition over all elements of $$G$$.
- Compute $$f(g)$$ coherently into an output register.
- Apply the Quantum Fourier Transform to the group register.
- Measure. The measurement outcome samples a character orthogonal to $$H$$.
- Repeat enough times. Solve the resulting system of linear relations classically to reconstruct generators of $$H$$.
For RSA, the group is $$(\mathbb{Z}/N\mathbb{Z})^*$$ – the multiplicative group of integers modulo $$N = pq$$. The function is $$f(x) = a^x \bmod N$$ for a randomly chosen $$a$$, and the hidden subgroup encodes the order of $$a$$, from which the factors of $$N$$ can be extracted classically.
For finite-field Diffie-Hellman, the group is $$\mathbb{Z}_{p-1} \times \mathbb{Z}_{p-1}$$. Given generator $$g$$ and target $$x = g^r \bmod p$$, the function $$f(a, b) = g^{a}x^{-b} \bmod p$$ hides a subgroup determined by the secret exponent $$r$$.
For ECC, the group is $$\mathbb{Z}_r \times \mathbb{Z}_r$$ where $$r$$ is the order of the base point $$P$$. Given public key $$Q = kP$$, the function $$f(a, b) = aP + bQ$$ is constant on cosets of a subgroup determined by the secret scalar $$k$$.
In every case, the Quantum Fourier Transform (QFT) extracts hidden periodicity from a quantum superposition, and classical post-processing converts that periodicity into the secret. The quantum speedup is the same. The circuit template is the same. The difference that drives the dramatically different resource profiles is the group operation circuit: modular exponentiation for RSA and finite-field DH, versus elliptic curve scalar multiplication for ECC.
This structural unity has a critical strategic implication. There is no “partial quantum threat” that breaks one algorithm family while sparing others. A CRQC capable of solving any one of these problems possesses the fundamental architectural components – sufficient logical qubits, non-Clifford gate throughput, and coherent runtime – to threaten all three. The question is purely quantitative.
Breaking RSA: period-finding in the multiplicative group
The mathematical machinery
Shor’s original 1994 algorithm converts integer factoring into order-finding. Given $$N = pq$$, choose a random integer $$a$$ coprime to $$N$$ and find the smallest positive integer $$r$$ – the order – such that: $$$a^r \equiv 1 \pmod{N}$$$
If $$r$$ is even and $$a^{r/2} \not\equiv -1 \pmod{N}$$, then $$\gcd(a^{r/2} \pm 1, N)$$ yields the factors with high probability. The quantum work is “just” order-finding – but implementing it at cryptographic scale is an enormous engineering challenge.
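To make the reduction concrete, here is a minimal classical sketch of the post-processing step. The toy modulus N = 15 and base a = 7 are illustrative choices of mine, not from any of the papers cited; the brute-force order-finding loop is exactly the part that is exponential classically and that Shor’s algorithm replaces.

```python
from math import gcd

def order(a, N):
    """Brute-force the multiplicative order of a mod N.
    (This is the step Shor's algorithm performs quantumly; done
    classically it is exponential in the bit-length of N.)"""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_postprocess(a, N):
    """Classical post-processing: turn the order r into factors of N."""
    r = order(a, N)
    if r % 2 != 0:
        return None                      # odd order: retry with another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None                      # a^(r/2) = -1 mod N: retry
    return tuple(sorted((gcd(y - 1, N), gcd(y + 1, N))))

# Toy example: 7 has order 4 mod 15, and gcd(7^2 - 1, 15) = 3,
# gcd(7^2 + 1, 15) = 5, recovering the factors of 15.
print(shor_postprocess(7, 15))   # → (3, 5)
```

The two “retry” branches are why the classical wrapper around the quantum core is probabilistic: a random base works with high probability, not certainty.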
The quantum circuit has three principal components. Let $$n = \lceil \log_2 N \rceil$$ denote the bit-length of the RSA modulus. Shor’s original register sizing chooses $$q$$ as a power of two satisfying $$N^2 \le q < 2N^2$$, requiring a control register of $$m \approx 2n$$ qubits initialized in superposition via Hadamard gates. A modular exponentiation circuit then implements the unitary: $$$|x\rangle|1\rangle \mapsto |x\rangle|a^x \bmod N\rangle$$$
through a sequence of controlled modular multiplications. This is the computational bottleneck – it accounts for the vast majority of gates, and its cost scales roughly as $$O(n^3)$$ using standard schoolbook multiplication. Finally, the Quantum Fourier Transform (QFT) extracts the period from the control register. Shor’s explicit gate construction uses $$\ell(\ell-1)/2$$ controlled-phase gates for an exact QFT over $$\mathbb{Z}_{2^\ell}$$, plus $$\ell$$ Hadamard gates and a final bit-reversal. The period $$r$$ is recovered via continued fraction expansion.
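The period-extraction statistics can be simulated classically for a toy instance. The sketch below (parameters N = 15, a = 7, and an 8-qubit control register, so q = 256 satisfies Shor’s sizing; all illustrative assumptions of mine) computes the post-QFT measurement distribution directly, then applies the continued-fraction step to a measured peak:

```python
from cmath import exp, pi
from fractions import Fraction

# Toy parameters: N = 15, a = 7 (order r = 4), control register of
# 8 qubits, so q = 256 satisfies N^2 <= q < 2N^2.
N, a, q = 15, 7, 256

# After measuring the output register, the control register is left in a
# superposition over {x : a^x = c mod N} = {x0, x0 + r, x0 + 2r, ...}.
r, x0 = 4, 1
support = range(x0, q, r)

def prob(y):
    """Unnormalized probability of measuring y after the QFT."""
    amp = sum(exp(2j * pi * y * x / q) for x in support)
    return abs(amp) ** 2

total = sum(prob(y) for y in range(q))
peaks = [y for y in range(q) if prob(y) / total > 0.01]
print(peaks)                      # multiples of q/r = 64: [0, 64, 128, 192]

# Continued-fraction step: a measured y is close to k*q/r, so rounding
# y/q to a denominator below N recovers r (when gcd(k, r) = 1).
frac = Fraction(192, q).limit_denominator(N - 1)
print(frac.denominator)           # → 4, the order of 7 mod 15
```

Because r divides q exactly here, the peaks are sharp; for general r the probability mass clusters near the multiples of q/r, which is why the continued-fraction rounding step is needed at all.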
In practice, the QFT is the cheap part. The phase estimation procedure requires controlled powers $$U^{2^j}$$ of the modular exponentiation unitary, and it is these reversible arithmetic circuits – modular multiplication, modular reduction, carry propagation – that dominate the gate count. The Ekerå-Håstad variant, used in all modern estimates, reduces the control register from $$\sim 2n$$ to roughly $$n/2$$ qubits by reformulating the problem as finding short discrete logarithms rather than full periods. Combined with the May-Schlieper truncation technique, this dramatically shrinks the total qubit footprint.
The resource estimation revolution
The trajectory of RSA-2048 quantum resource estimates tells a story of relentless optimization – and serves as a warning about the danger of anchoring to any single number.
In 2021, Gidney and Ekerå published the landmark estimate in the peer-reviewed journal Quantum: approximately 20 million physical qubits and 8 hours using surface codes. This assumed superconducting hardware with a physical error rate of $$10^{-3}$$, surface-code cycle time of 1 μs, and classical reaction time of 10 μs. At the logical level, the circuit required roughly 6,189 logical qubits and 2.6 billion Toffoli gates.
In 2024, Chevignard, Fouque, and Schrottenloher (published at CRYPTO 2025) introduced approximate residue arithmetic using the Residue Number System (RNS). Instead of exact modular arithmetic requiring full-width $$n$$-bit registers, the computation uses residues modulo small coprime primes, tolerating controlled approximation errors corrected statistically over multiple runs. The key trade: logical qubits dropped to approximately 1,730, but the Toffoli count exploded – roughly $$2^{36}$$ per run across approximately 40 runs, totaling on the order of $$10^{12}$$ gates.
Craig Gidney’s May 2025 preprint from Google Quantum AI synthesized and refined these innovations into a practical construction: 1,409 logical qubits, approximately 6.5 billion Toffoli gates, 897,864 physical qubits, and a runtime of less than a week (~5 days). Three techniques enabled this: approximate residue arithmetic (optimized to reduce Chevignard’s Toffoli penalty roughly 100-fold), yoked surface codes for denser idle qubit storage, and magic state cultivation replacing conventional distillation. The assumptions remain consistent: nearest-neighbor superconducting grid, $$10^{-3}$$ error rate, 1 μs cycle time, 10 μs reaction time. From 20 million physical qubits in 2021 to under a million in 2025 – a 20× reduction in four years.
Then came the Pinnacle Architecture. Published as a preprint on February 12, 2026 by Iceberg Quantum, it replaced surface codes with quantum low-density parity check (qLDPC) codes – specifically generalized bicycle codes – while keeping Gidney’s underlying algorithm. The claim: RSA-2048 factoring with fewer than 100,000 physical qubits. The runtime-qubit tradeoff is revealing: 98,000 qubits for one month, 151,000 for one week, or 471,000 for one day. (My analysis of this paper: Pinnacle Architecture: 100,000 Qubits to Break RSA-2048, but at What Cost?)
The caveats matter. qLDPC codes require quasi-local (not nearest-neighbor) connectivity between physical qubits, which no hardware platform has demonstrated at scale. The paper is unreviewed, and Iceberg Quantum announced it alongside a $6 million seed round.
The historical compression is striking: approximately 1 billion physical qubits (Fowler et al. 2012) → 230 million (O’Gorman & Campbell 2017) → 20 million (Gidney-Ekerå 2021) → ~900,000 (Gidney 2025) → <100,000 (Pinnacle 2026). Four orders of magnitude in fourteen years, driven entirely by algorithmic and architectural innovation – not hardware improvement.
A critical nuance: these estimates use different physical assumptions across papers, and the Pinnacle figure switches from surface codes to qLDPC codes, making the trend line less clean than it appears. But the direction is unambiguous.
Breaking ECC: the expensive arithmetic on smaller numbers
Adapting Shor for the elliptic curve discrete logarithm
Given an elliptic curve $$E$$ over a prime field $$\mathbb{F}_p$$ with base point $$P$$ of order $$r$$ and public key $$Q = kP$$, recovering the secret scalar $$k$$ is the elliptic curve discrete logarithm problem (ECDLP). The quantum attack uses two control registers of $$n$$ qubits each (where $$n$$ is the bit-length of the curve’s prime field), preparing a uniform superposition: $$$\sum_{a,b} |a\rangle|b\rangle$$$
A quantum oracle computes the group operation: $$$|a\rangle|b\rangle|R\rangle \mapsto |a\rangle|b\rangle|R + aP + bQ\rangle$$$
Since $$Q = kP$$, this becomes $$f(a,b) = (a + bk)P$$, making the function constant on cosets of the hidden subgroup: $$$H = \{(a, b) \in \mathbb{Z}_r \times \mathbb{Z}_r : a + bk \equiv 0 \pmod{r}\}$$$
Inverse QFTs on both control registers, followed by measurement and classical linear algebra, reveal $$k$$. Häner et al. (2020) describe the circuit explicitly: Hadamard gates create the uniform superposition over two registers with $$n+1$$ qubits each, the controlled group operation computes $$|aP + bQ\rangle$$, and a semiclassical Fourier transform on those registers enables measurement and classical post-processing.
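The coset structure is easy to verify on a toy curve. The sketch below uses the textbook curve $$y^2 = x^3 + 2x + 2$$ over $$\mathbb{F}_{17}$$, whose group has prime order 19; the curve, the secret scalar, and the test points are all illustrative assumptions, not parameters from the papers cited.

```python
# Toy curve: y^2 = x^3 + 2x + 2 over F_17; P = (5, 1) generates a
# group of prime order r = 19 (a standard textbook example).
p, A, r = 17, 2, 19
P = (5, 1)
INF = None  # point at infinity

def ec_add(R, S):
    """Affine point addition: one field inversion plus a few
    multiplications, the operation whose reversible circuit
    dominates the quantum ECDLP attack."""
    if R is INF: return S
    if S is INF: return R
    (x1, y1), (x2, y2) = R, S
    if x1 == x2 and (y1 + y2) % p == 0:
        return INF
    if R == S:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, p) % p
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def ec_mul(n, R):
    """Double-and-add scalar multiplication."""
    acc = INF
    for bit in bin(n % r)[2:]:
        acc = ec_add(acc, acc)
        if bit == '1':
            acc = ec_add(acc, R)
    return acc

k = 7                       # the secret scalar (illustrative)
Q = ec_mul(k, P)

def f(a, b):
    """The Shor oracle function f(a,b) = aP + bQ = (a + b*k)P."""
    return ec_add(ec_mul(a, P), ec_mul(b, Q))

# f is constant exactly on cosets of H = {(a,b) : a + b*k = const mod r}:
# stepping (a, b) -> (a + k, b - 1) keeps a + b*k fixed.
assert f(2, 5) == f((2 + k) % r, (5 - 1) % r)   # (2,5) and (9,4): same coset
assert f(2, 5) != f(3, 5)                        # different coset
```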
Why the arithmetic is harder per bit — but the numbers are smaller
Here lies the critical difference from RSA. Where RSA’s bottleneck is modular exponentiation – essentially repeated modular multiplication at $$O(n^2)$$ gates per operation – ECC requires elliptic curve point addition. Each point addition involves two field multiplications and one field inversion, giving a per-operation cost of $$O(n^2 \log n)$$ gates. The per-bit complexity of ECC arithmetic is higher than RSA’s.
But the key sizes are dramatically smaller: 256 bits for P-256 versus 2,048 bits for RSA-2048, or 3,072 bits for the classically equivalent RSA-3072. Since the total circuit scales as roughly $$O(n^3)$$ for both systems (the group operation is repeated $$O(n)$$ times), the smaller $$n$$ overwhelms the per-bit disadvantage. This is the heart of the quantum security inversion.
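A back-of-envelope calculation makes the point. The sketch below compares only the asymptotic shapes; leading constants are deliberately omitted, which is why it lands in the right neighborhood rather than on the exact 148× Toffoli figure from Roetteler et al.

```python
from math import log2

def rsa_gate_scale(n):
    # Modular exponentiation: O(n) multiplications at O(n^2) gates each.
    return n ** 3

def ecc_gate_scale(n):
    # O(n) point additions at O(n^2 log n) gates each (field inversion).
    return n ** 3 * log2(n)

# Equivalent 128-bit classical security: P-256 vs RSA-3072.
ratio = rsa_gate_scale(3072) / ecc_gate_scale(256)
print(round(ratio))   # → 216: the 12x key-size gap dwarfs ECC's per-bit penalty
```

Even with the log-factor penalty charged against ECC, the cubic dependence on key size dominates: $$12^3 / \log_2 256 = 1728 / 8 = 216$$.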
The circuit anatomy for ECDLP, as detailed in Roetteler et al. (2017), includes at least three logical components: the exponent workspace (two registers holding $$a$$ and $$b$$, sized at $$n+1$$ qubits each), a group-element accumulator (storing the elliptic curve point in affine coordinates plus ancillae for finite-field arithmetic including Montgomery multiplication and binary GCD inversion), and the QFT/phase-estimation mechanism (often implemented in semiclassical style to trade width against depth). Roetteler et al. established a tight qubit bound: at most $$$9n + 2\lceil\log_2 n\rceil + 10$$$
logical qubits. For $$n = 256$$ (the P-256 field size): $$9(256) + 2(8) + 10 = \textbf{2,330}$$ logical qubits.
The Toffoli count – which matters enormously because non-Clifford gates dominate both runtime and failure probability in fault-tolerant computation – was bounded at: $$$(448 \log_2 n + 4090) \cdot n^3$$$
For P-256, this yields approximately $$1.26 \times 10^{11}$$ Toffoli gates, with Toffoli depth of about $$1.16 \times 10^{11}$$.
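These bounds are simple enough to evaluate directly. A quick sketch using the two Roetteler et al. formulas quoted above; note the Toffoli expression is an upper bound, so it comes out slightly above the paper’s concrete ~1.26 × 10¹¹ count.

```python
from math import ceil, log2

def roetteler_qubits(n):
    """Logical-qubit upper bound from Roetteler et al. (2017):
    9n + 2*ceil(log2 n) + 10."""
    return 9 * n + 2 * ceil(log2(n)) + 10

def roetteler_toffoli(n):
    """Toffoli-count upper bound: (448*log2(n) + 4090) * n^3."""
    return (448 * log2(n) + 4090) * n ** 3

print(roetteler_qubits(256))            # → 2330 logical qubits for P-256
print(f"{roetteler_toffoli(256):.2e}")  # ≈ 1.3e11, consistent with ~1.26e11
```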
A decade of refinement: the ECC resource landscape
The Roetteler et al. baseline has been improved significantly, though, critically, not with the same intensity of optimization that RSA factoring has received.
Häner, Jaques, Naehrig, Roetteler, and Soeken (2020), published at PQCrypto 2020, improved the ECC arithmetic circuits through windowing techniques and a reformulated binary Euclidean algorithm for modular inversion. For P-256, they reduced logical qubits from 2,330 to approximately 2,124 and achieved a 119× improvement in T-gate count and a 54× improvement in T-depth. They also provided asymptotic scalings – T-count growing as $$436n^3 + o(n^3)$$ – and explored trade-offs between circuit width, T-count, and T-depth that become critical when translating logical costs into physical resources.
Moving from logical-level estimates to physical hardware, Ha, Lee, and Heo (2024) in Nature Scientific Reports provided the first comprehensive physical qubit estimates for ECDLP under surface codes, finding approximately 5.8 million physical qubits for P-256 at a physical error rate of $$10^{-3}$$ and algorithm failure probability of 0.01, with a cycle time of 200 ns. They explicitly confirmed that at the physical level, ECC is more vulnerable to quantum attack than RSA at comparable parameter sizes.
Gouzien, Ruiz, Le Régent, Guillaud, and Sangouard (2023), published in Physical Review Letters, took a radically different architectural approach using repetition cat qubits. Their result: a 256-bit ECDLP computed in 9 hours with 126,133 cat qubits, assuming a ratio between single- and two-photon losses of $$10^{-5}$$, a cycle time of 500 ns, and 2D nearest-neighbor connectivity. The cat qubit approach exploits exponential bit-flip suppression to reduce overhead – roughly 60× fewer qubits than an equivalent transmon/surface-code approach.
Most recently, Kim et al. (2026) on IACR ePrint claimed a 40% improvement in the qubit-count-depth product versus prior constructions, with P-224 (roughly RSA-2048 equivalent security) breakable with 6.9–19.1 million physical qubits in 34–96 minutes. The title, “Breaking Prime Elliptic Curve Cryptography in Minutes”, refers to theoretical circuit depth under optimistic physical assumptions. The paper is not yet peer-reviewed.
For secp256k1 – Bitcoin’s curve – the resource profile is essentially identical to P-256. Both are 256-bit prime field curves; the Roetteler formula depends on the field size $$n$$, not specific curve parameters. Webber et al. (2022) in AVS Quantum Science estimated secp256k1 would require 317 million physical qubits for a 1-hour attack, 13 million for a 1-day attack, or 1.9 billion for a 10-minute attack – all under surface codes with $$10^{-3}$$ error rate.
Breaking Diffie-Hellman: RSA’s mathematical twin
Finite-field Diffie-Hellman is often treated as a separate cryptosystem, but from a quantum perspective it is nearly identical to RSA. Both rely on the hardness of problems in the multiplicative group of a finite field – integer factoring for RSA, discrete logarithm for DH. Shor’s discrete logarithm algorithm solves the DH problem by computing a hidden linear constraint that Fourier sampling reveals.
Shor’s construction for discrete logarithm is explicit. Given a prime $$p$$, a generator $$g$$, and a target $$x \equiv g^r \pmod{p}$$, the algorithm uses three registers and a QFT register sized to $$q$$ satisfying $$p < q < 2p$$. It prepares superpositions over exponents $$a$$ and $$b$$ (modulo $$p-1$$), computes $$g^{a}x^{-b} \pmod{p}$$ into the third register, and applies Fourier transforms to extract the linear relationship encoding the secret exponent. In hidden-subgroup terms, the function $$f(a,b) = g^{a}x^{-b} \bmod p$$ is constant on cosets of a subgroup of $$\mathbb{Z}_{p-1} \times \mathbb{Z}_{p-1}$$ determined by the secret.
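The coset structure is the same as in the elliptic curve case, just with modular exponentiation as the group operation. A toy sketch with illustrative parameters of mine (p = 23, g = 5, secret exponent r = 9):

```python
# Toy finite-field DLP instance: p = 23, g = 5 (a generator of the
# multiplicative group, order 22), secret exponent r = 9.
p, g, r = 23, 5, 9
x = pow(g, r, p)          # the public target, here 11
x_inv = pow(x, -1, p)     # modular inverse (Python 3.8+)

def f(a, b):
    """The Shor oracle f(a,b) = g^a * x^(-b) mod p = g^(a - r*b) mod p."""
    return pow(g, a, p) * pow(x_inv, b, p) % p

# f is constant exactly on cosets of {(a,b) : a - r*b = const mod p-1}:
# stepping (a, b) -> (a + r, b + 1) stays inside the same coset.
assert f(3, 5) == f(3 + r, 5 + 1)
assert f(3, 5) != f(4, 5)
```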
The modular arithmetic circuits – modular exponentiation, modular multiplication, carry propagation – are the same class of operations used in RSA factoring. Gidney and Ekerå (2021) addressed both problems simultaneously, and their construction applies equally. The consequence is direct: DH-2048 requires the same quantum resources as RSA-2048, and DH-3072 matches RSA-3072. Gidney’s 2025 optimizations apply identically.
ECDH – the elliptic curve variant that dominates modern deployments – follows the ECDLP cost model, not the finite-field DLP model. This distinction is critical for operational planning. ECDH with X25519 or P-256 requires quantum resources proportional to 256-bit operations, while classical DH-2048 requires resources proportional to 2,048-bit operations. ECDH is the more vulnerable target by a wide margin — and it is the variant used in TLS 1.3, WireGuard, Signal, and essentially every modern protocol.
One nuance worth flagging: ephemeral Diffie-Hellman (DHE or ECDHE) provides no protection against a real-time quantum attack. If a CRQC exists at the time of the key exchange, ephemeral keys offer no advantage over static ones. However, ECDHE does mitigate harvest-now-decrypt-later (HNDL) attacks to a degree: because each session uses fresh keys, there is no long-lived private key whose compromise retroactively decrypts all past sessions. The adversary must solve a separate ECDLP instance for each recorded session.
The numbers compared: why ECC falls first
The comparison is most revealing when made at equivalent classical security levels, which is how cryptographic systems are actually deployed. P-256 ECC and RSA-3072 both provide approximately 128-bit classical security. Here is what breaking each requires on a quantum computer:
Roetteler et al. computed both sides explicitly. P-256 requires 2,330 logical qubits and $$1.26 \times 10^{11}$$ Toffoli gates. RSA-3072 requires approximately 6,146 logical qubits and $$1.86 \times 10^{13}$$ Toffoli gates. At equivalent classical security, ECC requires 2.6× fewer qubits and 148× fewer Toffoli gates.
The Toffoli count disparity is the more significant figure. In a fault-tolerant quantum computer, every Toffoli gate requires expensive “magic state” resources – either distillation or cultivation of non-Clifford states. The 148× gap in Toffoli count translates into an enormous difference in runtime, failure probability, and total physical resource consumption.
Microsoft’s Azure Quantum Resource Estimator provides a particularly illuminating comparison at the physical level. Under “reasonable” gate-based superconducting assumptions: ECC P-256 requires 5.87 million physical qubits and 21 hours versus RSA-2048’s 25.17 million physical qubits and 1 day. Note that P-256 provides stronger classical security than RSA-2048 (128-bit versus ~112-bit), yet requires 4.3× fewer physical qubits to break quantum-mechanically.
Scott Aaronson crystallized this in his February 15, 2026 blog post responding to the Pinnacle paper: “Elliptic curve crypto is likely to fall to quantum computers a bit before RSA and Diffie-Hellman will fall, because ECC’s ‘better security’ (against classical attacks, that is) led people to use 256-bit keys rather than 2,048-bit keys, and Shor’s algorithm mostly just cares about the key size.”
This deserves unpacking. Classical security equivalences exist because the best classical attacks scale differently for each problem. For RSA, the General Number Field Sieve runs in subexponential time $$L_N[1/3, 1.923]$$. For ECDLP, the best classical attack is Pollard’s rho algorithm requiring $$O(\sqrt{r})$$ operations. Because ECDLP is classically harder per bit, the same 128-bit security level demands 256-bit ECC keys but 3,072-bit RSA keys.
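The divergence in classical attack costs is easy to reproduce numerically. The sketch below evaluates the raw L-notation for GNFS, dropping the $$o(1)$$ term, so it overshoots the nominal 128-bit level somewhat, against Pollard’s rho square-root cost:

```python
from math import e, log, log2

def gnfs_bits(n_bits):
    """Rough GNFS cost in bits: log2 of L_N[1/3, 1.923] with the o(1)
    term dropped (so this overshoots the nominal security level)."""
    ln_N = n_bits * log(2)
    cost = e ** (1.923 * ln_N ** (1 / 3) * log(ln_N) ** (2 / 3))
    return log2(cost)

def pollard_rho_bits(n_bits):
    """Pollard's rho on a group of ~2^n elements: sqrt(2^n) group ops."""
    return n_bits / 2

print(round(pollard_rho_bits(256)))   # → 128 for P-256
print(round(gnfs_bits(3072)))         # ≈ 139 raw; o(1) corrections land near 128
```

The subexponential-versus-square-root gap is the entire reason 3,072 RSA bits and 256 ECC bits sit at the same classical level, and the entire reason Shor erases that equivalence.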
Shor’s algorithm obliterates this distinction. Its complexity is polynomial in the bit-length for both problems – roughly $$O(n^3)$$ gates. Quantum resources scale with key size, not classical security level. The 12:1 key-size ratio between RSA-3072 and P-256 translates directly into a massive quantum resource gap.
There is an important nuance that deserves more than a passing mention. Gidney’s 2025 optimizations to RSA factoring have not merely narrowed the gap at common parameter sizes – they have reversed it for the most widely deployed comparison. With Gidney-optimized RSA-2048, logical qubits drop to 1,409 – actually fewer than ECC P-256’s 2,330. And RSA-2048’s Toffoli count of 6.5 billion is roughly 19× lower than P-256’s 126 billion at the Roetteler baseline. On both metrics that determine quantum feasibility – qubit count and gate count – RSA-2048 is currently the easier target.
This seeming contradiction with the “ECC falls first” thesis resolves when you account for two factors. First, the comparison is unfair: RSA-2048 provides only ~112-bit classical security versus P-256’s ~128-bit. The security-equivalent comparison is P-256 versus RSA-3072, where ECC’s advantage is unambiguous (2.6× fewer qubits, 148× fewer Toffoli gates). Second, ECDLP circuits have received far less optimization attention than RSA factoring circuits. Roetteler et al.’s 2017 construction remains the baseline for most physical estimates, while RSA has benefited from Chevignard’s RNS approach, Gidney’s approximate residue arithmetic, yoked surface codes, and magic state cultivation. If comparable effort were directed at ECDLP – and the Kim et al. (2026) preprint suggests it is beginning – the ECC figures would likely compress significantly. As a blog commenter on Aaronson’s post noted, ECC’s per-operation circuit complexity partially offsets the key-size advantage, but at equivalent security levels, key size still dominates.
The foundational result from Proos and Zalka (2003) established a ratio that has remained stable across two decades: a 160-bit elliptic curve key requires roughly 1,000 qubits to break, while the classically equivalent 1,024-bit RSA modulus requires approximately 2,000. That 2:1 qubit ratio at equivalent classical security has survived every subsequent refinement.
The bottom line is uncomfortable: a CRQC that has just barely crossed the threshold to solve 256-bit ECDLP may not yet be capable of factoring 2,048-bit RSA integers. Organizations that migrated from RSA to ECC for classical efficiency – which is nearly every modern enterprise – may have inadvertently increased their quantum exposure.
Where these algorithms live: a deployment audit
Understanding the quantum threat requires knowing where RSA, ECC, and DH actually sit in production systems. The answer is “everywhere.”
ECC: the invisible backbone of modern infrastructure
The sheer ubiquity of ECC in production infrastructure is staggering – and largely invisible to the executives responsible for managing the risk.
TLS 1.3, which now covers roughly 75% of the top websites per Qualys SSL Pulse, mandates ECDHE for key exchange. In practice, approximately 97–98% of all TLS key exchanges use ECDHE, with X25519 (Curve25519) as the default key share in Chrome and most modern clients. Every API call, every authenticated session, every mobile app connection – these are ECDHE key exchanges. Let’s Encrypt’s Certbot 2.0 now defaults to ECDSA P-256 certificates for all new issuances, pushing an enormous volume of web authentication toward ECC even on infrastructure that historically used RSA.
Passkeys – the industry’s bet on passwordless authentication – run entirely on ECC. The FIDO Alliance reported over 3 billion passkeys securing consumer accounts as of late 2025. The W3C WebAuthn specification uses COSE algorithm identifiers, with ES256 (ECDSA P-256) as the default and most widely supported algorithm. Apple’s Secure Enclave and Android’s Keystore both use ECDSA P-256 for hardware-backed key operations. A CRQC capable of ECDLP doesn’t just break encryption – it turns every passkey-protected account into an impersonation vector.
EMV chip cards increasingly use ECDSA for offline data authentication, and over 1 billion ePassports in circulation across more than 140 countries rely on digital signatures – using either ECDSA or RSA, depending on the issuing country. ICAO Doc 9303 governs the cryptographic framework.
Cryptocurrency represents perhaps the largest single deployment of a single ECC curve. Bitcoin and Ethereum both use secp256k1 for all transaction signing. Bitcoin has generated over 1 billion cumulative addresses; Ethereum exceeds 300 million unique addresses. Bitcoin’s BIP 340 standardized Schnorr signatures for secp256k1 via the Taproot upgrade in November 2021 – but Schnorr still relies on the same elliptic curve group, so Shor’s ECDLP attack applies identically.
End-to-end encrypted messaging – Signal Protocol (used by Signal, WhatsApp’s 2+ billion users, and Facebook Messenger) – relies on X25519 ECDH for the Double Ratchet key agreement. WireGuard VPN uses X25519 exclusively. Apple’s iMessage has layered post-quantum protection on top, but the classical layer remains ECC.
RSA: the legacy anchor
RSA’s footprint is shifting but remains massive. Approximately 75% of TLS certificates still carry RSA public keys, even as ECDSA adoption accelerates. The most popular TLS cipher suite – ECDHE-RSA-AES128-GCM-SHA256 – combines ECDHE key exchange with RSA authentication, meaning a quantum attacker would need to break both to fully compromise a session (ECDH for confidentiality, RSA for authentication).
Beyond web TLS, RSA anchors enterprise infrastructure extensively. Microsoft Active Directory Certificate Services defaults to RSA. PGP/GPG defaults to RSA-3072 or RSA-4096 for key generation. Government PKI systems – US Federal PIV, EU eIDAS – still extensively issue RSA-2048 certificates, though migration guidelines are tightening. On SSH, RSA and Ed25519 have reached near-parity on GitHub at approximately 48% each as of January 2025, but RSA still dominates on GitLab (~76%) and Launchpad (~90%).
Diffie-Hellman: ECDH won, classical DH is fading
Classical finite-field DHE accounts for only 1–2% of TLS connections, vastly overshadowed by ECDHE. IPsec/IKEv2 deployments increasingly use ECDH groups. The practical quantum threat to Diffie-Hellman is overwhelmingly an ECDH story, which means it follows the ECDLP cost model – the more vulnerable one.
The quantum resource comparison table
The following table synthesizes the best available estimates. All estimates assume surface code error correction with physical error rate $$10^{-3}$$ unless noted.
| Target | Logical qubits | Toffoli gates | Physical qubits | Runtime | Source |
|---|---|---|---|---|---|
| RSA-2048 | ~1,409 | ~6.5 × 10⁹ | ~898,000 (surface) | ~5 days | Gidney 2025 |
| RSA-2048 | ~1,409 | ~6.5 × 10⁹ | <100,000 (qLDPC) | ~1 month | Pinnacle 2026 |
| RSA-2048 | ~6,189 | ~2.6 × 10⁹ | ~20 million | ~8 hours | Gidney-Ekerå 2021 |
| RSA-3072 | ~6,146 | ~1.86 × 10¹³ | ~30–40 million | ~15–20 hrs | Roetteler 2017 (interpolated) |
| ECDSA P-256 | 2,330 | 1.26 × 10¹¹ | ~5.8 million | Hours–days | Roetteler 2017 / Ha 2024 |
| ECDSA P-256 | ~2,124 | 119× fewer T-gates | — | — | Häner 2020 |
| ECDSA P-256 | — | — | 126,133 (cat qubits) | 9 hours | Gouzien 2023 |
| secp256k1 | 2,330 | 1.26 × 10¹¹ | 317M (1hr) / 13M (1d) | 1 hr–1 day | Webber 2022 |
| DH-2048 | ~1,409–6,189 | ~2.6–6.5 × 10⁹ | Same as RSA-2048 | Same as RSA-2048 | Gidney-Ekerå / Gidney 2025 |
The Microsoft Azure Quantum Resource Estimator comparison is particularly valuable for its cross-platform consistency. Under identical “reasonable” superconducting assumptions: ECC P-256 needs 5.87 million qubits (21 hours) versus RSA-2048’s 25.17 million qubits (1 day). Under “optimistic” assumptions: 1.54 million and 11 hours versus 5.83 million and 12 hours. ECC consistently requires roughly 4× fewer physical qubits across every hardware scenario – despite providing stronger classical security.
The symmetric footnote: why Grover isn’t the problem
No discussion of quantum threats to cryptography is complete without addressing Grover’s algorithm and symmetric ciphers. The distinction matters: Shor poses an existential threat to public-key cryptography. Grover poses a manageable speedup against symmetric primitives.
Grover’s search algorithm reduces exhaustive key search from $$O(2^k)$$ to approximately $$O(2^{k/2})$$ operations in the idealized query model. AES-256 retains an effective 128-bit security level under quantum attack – still astronomically large. AES-128 drops to an effective 64 bits, which is theoretically marginal but practically unexploitable: Grover’s iterations are inherently sequential and cannot be efficiently parallelized.
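The arithmetic behind this is a one-liner. A quick sketch; the iteration count uses the standard optimal Grover count $$\lfloor (\pi/4)\sqrt{2^k} \rfloor$$:

```python
from math import floor, pi, sqrt

def grover_effective_bits(key_bits):
    """Grover halves the exponent: a 2^k search needs ~2^(k/2) iterations."""
    return key_bits / 2

def grover_iterations(key_bits):
    """Optimal iteration count floor((pi/4) * sqrt(2^k)); each iteration
    must run sequentially, which is what makes this impractical."""
    return floor(pi / 4 * sqrt(2 ** key_bits))

print(grover_effective_bits(256))        # → 128.0: AES-256 stays comfortable
print(f"{grover_iterations(128):.2e}")   # ≈ 1.45e19 sequential oracle calls
```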
ETSI’s TR 103 967 (2025) addresses this directly, analyzing the practicality of parallelizing Grover’s algorithm given circuit depth limits, error correction overhead, and realistic hardware constraints. Their conclusion: even with 6,000 perfect logical qubits, breaking AES-256 via Grover would take on the order of $$10^{32}$$ years. The simple mitigation – use AES-256, which CNSA 2.0 already mandates – resolves the symmetric threat entirely.
The practical implication is clear: symmetric cryptography is not the urgent migration target. The urgency is entirely on the public-key side – and within the public-key side, the urgency is greatest for ECC.
The standards are ready. The clocks are running.
What NIST has finalized
NIST finalized three post-quantum standards on August 13, 2024, effective the following day (My write-up on those: Post-Quantum Cryptography (PQC) Standardization – 2025 Update):
FIPS 203 – ML-KEM (Module-Lattice Key Encapsulation Mechanism, formerly CRYSTALS-Kyber) replaces ECDH/DH key exchange using module-LWE lattice problems, with parameter sets ML-KEM-512, ML-KEM-768, and ML-KEM-1024.
FIPS 204 – ML-DSA (Module-Lattice Digital Signature Algorithm, formerly CRYSTALS-Dilithium) replaces ECDSA/RSA digital signatures.
FIPS 205 – SLH-DSA (Stateless Hash-Based Digital Signature Standard, formerly SPHINCS+) provides a conservative hash-based signature backup.
HQC – a code-based backup KEM – was selected for standardization in March 2025, with a final FIPS standard expected around 2027. FN-DSA (FIPS 206, based on FALCON) remains in draft; its finalization has been delayed due to implementation complexity, and the NSA has indicated it does not anticipate adding FN-DSA to the CNSA 2.0 suite.
Real-world deployment is already happening
Hybrid post-quantum key exchange – combining classical ECDH with post-quantum ML-KEM – is no longer experimental. Cloudflare reported that hybrid X25519+ML-KEM-768 TLS key exchange crossed 50% of their traffic by late October 2025, driven by Chrome’s default enablement and Apple’s iOS/macOS enabling post-quantum hybrid by default in September 2025. By year-end 2025, the figure had climbed well past 60%.
Chrome ships X25519MLKEM768 as its default key share. The hybrid approach ensures that even if one algorithm has an undiscovered flaw, the other protects confidentiality – a belt-and-suspenders strategy that eliminates the need for organizations to make a binary bet.
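The belt-and-suspenders property comes from how the two shared secrets are combined: both are fed into a single key derivation, so an attacker must break both algorithms to recover the session key. Below is a toy sketch of that combiner idea using HKDF-Extract (RFC 5869); the label, secret ordering, and stand-in secrets are illustrative, not the exact X25519MLKEM768 wire construction.

```python
import hashlib
import hmac
import secrets

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    """HKDF-Extract (RFC 5869) instantiated with SHA-256."""
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hybrid_secret(ss_classical: bytes, ss_pq: bytes) -> bytes:
    """Toy hybrid combiner: concatenate both shared secrets and extract.
    Recovering the output requires knowing BOTH inputs, so a break of
    either the classical or the post-quantum algorithm alone is not enough."""
    return hkdf_extract(salt=b"hybrid-demo", ikm=ss_pq + ss_classical)

# Stand-ins for real X25519 / ML-KEM-768 shared secrets (illustrative only).
ss_ecdh = secrets.token_bytes(32)
ss_mlkem = secrets.token_bytes(32)
session_key = hybrid_secret(ss_ecdh, ss_mlkem)
```

Because the derivation mixes both inputs, even a future Shor-equipped adversary replaying recorded traffic still faces the ML-KEM half of the exchange.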
The compliance deadlines
The NSA’s CNSA 2.0 guidance for National Security Systems sets hard deadlines. Software and firmware signing must support CNSA 2.0 algorithms by 2025 and use them exclusively by 2030. Web browsers, servers, and cloud infrastructure must support them by 2025 and use them exclusively by 2033. Traditional networking equipment must support them by 2026 and use them exclusively by 2030. Full transition across all NSS completes by 2035, and new NSS acquisitions must be CNSA 2.0 compliant by January 1, 2027.
The UK NCSC’s migration roadmap sets phased milestones – identify by 2028, prioritize by 2031, complete by 2035. Australia mandates no traditional asymmetric crypto beyond 2030. These are not predictions of when a CRQC will arrive. They are deadlines for when systems must be protected regardless.
The migration blind spot
Most PQC migration playbooks prioritize RSA deprecation – because RSA is older, more visible in legacy systems, and was already being phased out for performance reasons. This focus risks a dangerous blind spot on ECC, which is simultaneously more ubiquitous in modern systems and more quantum-vulnerable.
The HNDL calculus favors ECC urgency
The harvest-now-decrypt-later threat is the most immediate quantum risk, and it falls disproportionately on key exchange. Every ECDHE key exchange in a TLS 1.3 session is a potential HNDL target: an adversary recording encrypted traffic today can decrypt it once a CRQC arrives. And because most physical-resource estimates put breaking ECDH P-256 at or below the cost of breaking RSA-2048, the HNDL window for ECC-protected data is potentially shorter – meaning the data you’re protecting today may be exposed sooner than you expect.
The data at stake ranges from authentication credentials and session contents to proprietary communications and regulated personal data. Compliance frameworks from PCI DSS to GDPR impose no expiration date on the obligation to protect sensitive data – and neither does an adversary’s ability to wait.
The signature forgery threat is slower but structural
For digital signatures, the threat model is “trust-now-forge-later.” ECDSA signatures on software updates, firmware images, EMV offline data authentication, and legal documents could be forged retroactively. With 3+ billion passkeys, over 1 billion cryptocurrency addresses, and every TLS 1.3 handshake depending on ECC, the attack surface for signature forgery is vast.
Organizations with long-lived trust anchors are particularly exposed. Root CA certificates, code signing keys, HSM firmware signing keys, and FIDO attestation certificates – all of these create chains of trust that extend years or decades into the future.
Concrete steps for security leaders
Conduct a dual-track cryptographic inventory. Do not treat RSA and ECC migration as a single workstream. Map every system that uses ECC – TLS endpoints, passkey infrastructure, HSMs, code signing, VPN gateways, IoT device identity – separately from RSA-dependent systems. Prioritize them independently based on data sensitivity and retention requirements.
Migrate key exchange first. Deploy hybrid X25519+ML-KEM-768 for all TLS connections handling sensitive data. This is production-ready today – over half of global TLS traffic to major CDNs already uses it. For external APIs and internal service-to-service communication, evaluate ML-KEM-768 as your primary key encapsulation mechanism.
Audit your HNDL exposure window. For every system that processes data requiring confidentiality beyond 2035, calculate the gap between your expected migration completion date and the earliest plausible CRQC timeline. If the gap is negative – meaning data needs to be secret for longer than you have before a potential CRQC – that system needs hybrid key exchange now.
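The gap check above is essentially Mosca’s inequality: if data shelf life plus migration time exceeds the time until a plausible CRQC, you are already exposed. A minimal sketch with illustrative numbers (substitute your own estimates for all three inputs):

```python
def hndl_safety_margin(shelf_life: int, migration_years: int,
                       crqc_horizon: int) -> int:
    """Mosca-style check: margin = crqc_horizon - (migration + shelf life).
    A negative margin means the data must stay secret for longer than you
    have before a potential CRQC -- the 'negative gap' described above."""
    return crqc_horizon - (migration_years + shelf_life)

# Illustrative inputs only: 15-year data retention, 5-year migration,
# CRQC assumed plausible within 12 years.
margin = hndl_safety_margin(shelf_life=15, migration_years=5, crqc_horizon=12)
if margin < 0:
    print(f"Exposed by ~{-margin} years: deploy hybrid key exchange now.")
else:
    print(f"Safety margin of ~{margin} years -- keep migration on schedule.")
```

The value of writing it down per system is that the three inputs force the right conversations: retention owners set the shelf life, engineering sets the migration estimate, and leadership picks the CRQC horizon it is willing to bet on.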
Plan for signature migration, but don’t delay key exchange for it. ML-DSA (FIPS 204) is ready for deployment. Begin testing in non-production environments and plan for integration into certificate issuance, code signing, and message authentication. But do not let the complexity of signature migration slow down your key exchange migration.
Pressure your vendors. HSM vendors, cloud providers, identity platforms, certificate authorities – every link in your cryptographic supply chain needs a PQC migration roadmap. Ask for specific timelines, not vague commitments.
The paradox at the bottom
The quantum security inversion is a cautionary tale about optimizing for the wrong threat model. ECC was adopted across the industry for excellent reasons: smaller keys meant faster handshakes, lower bandwidth, better performance on constrained devices. Every argument for ECC over RSA was correct – in a world without quantum computers.
Shor’s algorithm does not respect our classical security equivalences. It respects only the size of the numbers involved. The same property that made 256-bit keys sufficient against classical adversaries – the extraordinary hardness of ECDLP per bit – is precisely what makes them insufficient against quantum ones. The cryptographic systems most praised for their classical elegance are those most exposed to quantum attack. The comparison is complicated by the uneven pace of circuit optimization – RSA factoring has received a decade of relentless engineering attention that ECDLP circuits have not – but the structural vulnerability remains: at any given classical security level, ECC presents a smaller quantum target.
The post-quantum standards are finalized. The hybrid implementations are in production. Over half of TLS traffic to major CDNs is already quantum-protected. The technical barriers to migration are lower than they have ever been.
The question is no longer whether your ECC infrastructure is vulnerable. It is whether you will migrate it before someone records the traffic that proves it.
Quantum Upside & Quantum Risk - Handled
My company - Applied Quantum - helps governments, enterprises, and investors prepare for both the upside and the risk of quantum technologies. We deliver concise board and investor briefings; demystify quantum computing, sensing, and communications; craft national and corporate strategies to capture advantage; and turn plans into delivery. We help you mitigate the quantum risk by executing crypto-inventory, crypto-agility implementation, PQC migration, and broader defenses against the quantum threat. We run vendor due diligence, proof-of-value pilots, standards and policy alignment, workforce training, and procurement support, then oversee implementation across your organization. Contact me if you want help.