
PQC Signature Migration Before Encryption: Why Trust Infrastructure Comes First

The SolarWinds Shortcut

In December 2020, the security industry watched in slow-motion horror as the SolarWinds attack unfolded. A nation-state adversary had compromised the build system of a trusted software vendor, injected malicious code into a routine update, and signed it with SolarWinds’ legitimate code-signing certificate. The signed update shipped to 18,000 organizations. Because the signature checked out, automated update systems did exactly what they were designed to do: they trusted it.

The operation was a masterclass in patience and operational craft. It took months of careful infiltration, social engineering, and painstaking stealth to compromise that build pipeline. One wrong move and the intrusion would have been detected before the payload ever reached customers.

Now consider what a cryptographically relevant quantum computer (CRQC) does to this equation. The attacker no longer needs to compromise the build system. The code-signing private key can be mathematically derived from the public key, which is, by definition, public. The months of infiltration, the social engineering, the operational risk of discovery — all of it collapses to a single computation. Obtain the public key. Run Shor’s algorithm. Sign whatever you want. Push it through the legitimate update channel.

The same SolarWinds-scale attack. No infiltration required.

This thought experiment is why, after years of leading PQC migrations at enterprise scale and decades of offensive security work, I have come to a conclusion that runs against the grain of most migration guidance: signature migration should be sequenced before, or at minimum alongside, encryption migration in every PQC program plan. The conventional wisdom of “protect data first” is understandable, but it is wrong for most organizations.

The Conventional Wisdom and Its Hidden Assumption

The data-first approach to PQC migration has a clear logic behind it, and I want to acknowledge that before arguing against it.

The Harvest Now, Decrypt Later (HNDL) threat is active today. Nation-state adversaries are intercepting and storing encrypted traffic, betting that a future quantum computer will let them decrypt it. For data with long-term sensitivity (diplomatic communications, health records, trade secrets, intelligence assessments), every day of delay adds another day’s worth of harvestable material. That material can never be un-harvested.

On the practical side, key exchange migration has a head start. Hybrid TLS using X25519Kyber768 (now X25519MLKEM768) has been deployed in production by Google, Cloudflare, and AWS. Chrome defaults to it. The tooling is mature. The performance overhead is negligible. You can protect the confidentiality of new sessions today with a configuration change.

Most migration guides, including early guidance from CSA and others, default to this framing: protect data in transit first, because the threat is active and the tooling is ready.

But there is a buried assumption in this approach: that confidentiality compromise is more damaging than integrity and authenticity compromise. For intelligence agencies and organizations that handle classified material with multi-decade sensitivity requirements, that assumption may hold. For most enterprises, where the operational continuity of software supply chains, identity systems, and device fleets matters more than retroactive data exposure, it is exactly backwards.

Ask yourself this question: If you could only protect one thing today, would you protect last year’s encrypted data from future decryption, or would you protect your code-signing infrastructure from future forgery? For most CISOs, the answer, once they think it through, is clear. The encrypted data, once exposed, causes a bounded harm. The forged signing key opens the door to harm that has no natural boundary.

Why Signatures Are the Bigger Threat

I first articulated the signature-side quantum threat in 2018 under the name Sign Today, Forge Tomorrow (STFT); it later gained traction in the industry as the Trust Now, Forge Later (TNFL) thesis. The core argument is simple, but its implications run deep.

HNDL is a slow burn. An adversary who eventually decrypts harvested data learns what was said in the past. The information may be valuable, but the damage is bounded and retrospective. Each session requires separate decryption effort. The exposure accumulates gradually. And for sessions protected by perfect forward secrecy, the scope is limited further.

TNFL is a cliff. An adversary who can forge a single code-signing key gains the ability to distribute malware through every legitimate update channel that trusts that key. One forged certificate authority private key renders every certificate issued under that CA untrustworthy, meaning any domain can be impersonated. A forged firmware signing key enables persistent hardware-level compromise across every device from that vendor. A forged identity provider key means any user identity can be impersonated across every system that trusts that provider.

Q-Day for encryption will feel like a gradual erosion, with individual datasets decrypted one at a time, each requiring significant compute, and impacts emerging over months or years as adversaries work through their backlog. Q-Day for signatures will feel like a cliff: sudden, systemic, and cascading. The moment an adversary can derive a signing key, they can act immediately, at any scale they choose, against any system that trusts that key.

As I wrote in Quantum Sovereignty, participants in the tabletop exercises I have conducted on quantum-enabled signature forgery (structured simulations with government and enterprise security leaders) consistently describe the outcomes as “truly sobering.” The exercise walks participants through a scenario where a state-level adversary has forged a firmware signing key for a widely deployed industrial control system. The malicious update ships through the legitimate vendor channel. Operators install it without hesitation because the signature verifies. From that single point of compromise, the cascading consequences quickly overwhelm incident response capabilities.

Participants enter these exercises thinking of quantum risk as a data confidentiality problem. They leave understanding it as a systemic trust problem. The revelation is always the same: they had no idea how many automated systems accept signed inputs without human verification, and how rapidly a signature compromise propagates through interconnected infrastructure.

The Numbers Have Arrived, and They Favor the Attacker

For years, the TNFL thesis was a strategic argument without precise resource estimates. That changed in 2026 with two papers that should be required reading for every CISO making migration sequencing decisions.

The first is the Chevignard, Fouque, and Schrottenloher paper presented at EUROCRYPT 2026, which demonstrated that breaking P-256 ECDLP (the elliptic curve underpinning ECDSA signatures across WebPKI, code signing, JWT, and FIDO2) requires only 1,193 logical qubits. P-224 falls at 1,098. These are not large numbers by the standards of current quantum computing roadmaps.

The second is the Google Quantum AI whitepaper, which showed that breaking ECC-256 (specifically secp256k1, the curve used in Bitcoin and Ethereum signatures) could be accomplished with fewer than 500,000 superconducting physical qubits in approximately 9 minutes. Google’s team also introduced a zero-knowledge proof to validate their resource estimates without disclosing attack details, a responsible disclosure model that itself signals how seriously they take the proximity of this capability. The 9-minute figure has a particularly sharp implication: on a fast-clock superconducting architecture, an attacker could potentially derive a private key and forge a transaction within the average block time of Bitcoin. This is not bulk decryption of historical data. This is real-time forgery, the quintessential TNFL scenario, realized in a financial system that secures over a trillion dollars in assets.

Compare these figures to the best current estimates for breaking RSA-2048, the workhorse of encryption. Gidney’s 2025 paper brought the estimate down to under 1 million physical qubits and under one week of runtime, using 1,399 logical qubits. That was itself a dramatic reduction from the previous estimate of 20 million qubits.

The irony is painful. For two decades, the industry migrated toward ECC and away from RSA because ECC offered shorter keys, better performance, and equivalent classical security. ECDSA P-256 became the default signature algorithm for WebPKI certificates, code signing, mobile authentication, and identity federation. The very algorithm the industry adopted as its modern trust foundation is the one most efficiently attacked by a quantum computer. The signatures protecting modern infrastructure are more quantum-vulnerable than the encryption they replaced.

This is something I have been arguing since 2018: that ECC has been dangerously under-researched relative to RSA in quantum cryptanalysis, and that the first practical quantum attacks would likely target signatures rather than encryption. The Chevignard and Google papers have validated that position with hard numbers. As the Google team noted in their whitepaper, ECDLP-focused quantum algorithms have received far less research attention than RSA-focused ones, meaning the estimates for ECDLP are likely farther from optimal and will continue to improve.

For readers tracking quantum progress through my CRQC Quantum Capability Framework, the resource gap between breaking ECC signatures and RSA encryption means that the quantum capability threshold for undermining trust infrastructure arrives earlier than the threshold for bulk decryption.

The Cascade Asymmetry: What Actually Breaks

The resource estimates tell you which falls first. The blast radius analysis tells you which falling matters more.

When an encryption key is compromised in an HNDL scenario, the attacker can read previously captured sessions. Each session requires separate decryption effort and a substantial energy expenditure. The damage is bounded by the volume of traffic the adversary captured and the compute resources they allocate to decryption. Perfect forward secrecy in TLS further limits the scope: compromising one session key does not compromise others.

When a signing key is compromised, the damage multiplies in every direction.

A compromised code-signing key turns the SolarWinds scenario from a multi-million-dollar intelligence operation into a mathematical computation. Every device that trusts that signing key will accept malicious updates. A compromised certificate authority key means every certificate issued under that CA becomes suspect, and the attacker can issue new certificates for any domain they choose, enabling phishing at scale, man-in-the-middle interception of financial transactions, and impersonation of any website on the internet. A compromised firmware signing key enables persistent, hardware-level compromise across entire device fleets, the kind of compromise that survives factory resets and operating system reinstallations. A compromised identity provider signing key (the SAML or OAuth or JWT signing key) means the attacker can mint authentication tokens for any user in any system that trusts that provider, bypassing every access control, multi-factor check, and audit trail downstream.

Consider what this means in practice. An enterprise with 50,000 employees using a federated identity provider with ECDSA-signed SAML assertions has, today, a single point of cryptographic failure. If the signing key is derived, the attacker can log in as the CEO, as the CFO, as any system administrator, and generate authentication artifacts that are indistinguishable from legitimate ones. No credential stuffing, no phishing, no MFA bypass needed. The mathematics are the authentication.

Shor’s algorithm does not discriminate among RSA, ECC, and Diffie-Hellman by use case, but the consequences of compromise do. A decrypted TLS session reveals what was communicated. A forged signing key lets the attacker become the communicator.

The asymmetry goes further. Encryption compromise exposes past data. Signature compromise enables future active attacks, at a time and target of the attacker’s choosing, with a payload of their selection. The attacker is not passively reading old traffic. They are actively forging trust, inserting themselves into the living infrastructure of the organization.

In 30 years of offensive security work (penetration testing and red teaming of critical national infrastructure, defense systems, and Fortune Global 500 enterprises) I have never seen a data exposure cause as much operational damage as a compromised trust anchor. Data breaches are expensive, embarrassing, and regulatory nightmares. But a compromised signing key that enables supply chain attacks, identity impersonation, and firmware manipulation can cause physical-world consequences that make a data breach look manageable.

For anyone who works in environments where digital trust intersects with physical systems (industrial control systems, medical devices, automotive firmware, energy infrastructure) the stakes of signature compromise extend well beyond IT. This is the territory I explored in Cyber-Kinetic Security: the boundary where digital integrity failures produce physical consequences.

The Operational Argument: Harder Migrations Need More Lead Time

If the threat argument establishes that signature compromise is more severe, the operational argument follows naturally: signature migration is also harder than key-exchange migration, which means it needs more lead time, not less.

Hybrid TLS key exchange is, comparatively, straightforward. You update the TLS library on both ends of the connection, negotiate the hybrid key agreement, and the session is protected. The endpoints are typically under your control. The certificate infrastructure stays the same. The configuration change is, in many environments, a single flag.

Signature migration touches a fundamentally different set of problems.

Certificate size inflation. An ML-DSA-65 (FIPS 204, formerly CRYSTALS-Dilithium) public key is 1,952 bytes and its signature is 3,309 bytes. An ECDSA P-256 public key is 64 bytes and its signature is 64 bytes. A typical WebPKI certificate chain currently adds roughly 3-5 KB to a TLS handshake; the same chain using ML-DSA would add approximately 14.7 KB. That is why Google and Cloudflare are pursuing Merkle Tree Certificates — a fundamental redesign of WebPKI authentication — rather than simply dropping ML-DSA into existing X.509 structures. The fact that the industry’s two largest TLS operators concluded that the existing certificate architecture cannot absorb post-quantum signatures without redesign should tell every CISO something about the complexity of this migration.
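The size delta is easy to sanity-check with a few lines of arithmetic. The sketch below is a deliberate simplification: it counts only the public keys and signatures in a two-certificate transmitted chain, ignoring DER encoding overhead, extensions, and SCTs, so it understates the full ~14.7 KB figure, but it shows where the inflation comes from.

```python
# Back-of-the-envelope comparison of cryptographic material in a TLS
# certificate chain. Sizes are the published ML-DSA-65 (FIPS 204) and raw
# ECDSA P-256 parameters; the two-certificate chain model is a
# simplification for illustration, not a full handshake accounting.

ECDSA_P256 = {"pubkey": 64, "sig": 64}
ML_DSA_65 = {"pubkey": 1952, "sig": 3309}


def chain_overhead(alg: dict, certs_sent: int = 2) -> int:
    """Bytes of keys + signatures sent for a chain of `certs_sent` certificates."""
    return certs_sent * (alg["pubkey"] + alg["sig"])


classical = chain_overhead(ECDSA_P256)  # 256 bytes
pq = chain_overhead(ML_DSA_65)          # 10,522 bytes
print(f"ECDSA chain crypto material:  {classical} B")
print(f"ML-DSA chain crypto material: {pq} B (+{pq - classical} B)")
```

Even this conservative model shows a 40x inflation in the cryptographic material alone, which is why simply swapping the algorithm inside existing X.509 structures was judged untenable by the largest TLS operators.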

HSM support gaps. Not all hardware security modules support ML-DSA yet. AWS KMS began offering ML-DSA for code signing operations via AWS Private CA in late 2025. Microsoft announced ML-DSA support in Windows Server 2025 and Windows 11, with Active Directory Certificate Services PQC support targeted for 2026. But many organizations run HSM-backed signing infrastructure from vendors whose PQC roadmaps are less clear. If your code-signing or CA private keys live in an HSM that cannot generate or use ML-DSA keys, your migration is blocked at the hardware level.

WebPKI coordination. For publicly trusted TLS certificates, the migration is blocked by a dependency chain that no individual organization controls. The CA/Browser Forum must update its Baseline Requirements to permit ML-DSA. DigiCert is drafting a ballot to enable this, but the timeline is 2026-2027 at the earliest. Meanwhile, RFC 9881 (published October 2025) defined the X.509 algorithm identifiers for ML-DSA, giving the IETF side of the plumbing a foundation to build on. For private PKI (your internal certificate authority) you can move now, and you should.

Embedded device constraints. Devices with 15-25 year operational lifetimes and non-upgradeable signature verification represent a category of risk that has no equivalent on the encryption side. If a device has RSA or ECDSA verification fused into hardware with no mechanism for algorithm update, that device will become unsecurable once a CRQC exists. These devices are being deployed today, and every unit shipped with hardcoded classical-only verification is a liability that compounds over time.

Long-lived signed artifacts. Firmware images, signed software packages, legal documents, and contracts that were signed with ECDSA or RSA cannot be retroactively re-signed with PQC algorithms. If the validity of those signatures is ever challenged (and a CRQC makes challenge trivial) there is no remediation path. Encryption protects data at a moment in time; signatures establish trust over extended periods.

Starting with the easy part (hybrid TLS) and deferring the hard part (signature infrastructure) is the classic engineering mistake of optimizing for early wins over the critical path. It is the equivalent of paving the driveway while ignoring cracks in the foundation. I documented the full scale of PQC migration complexity, including the 120,000-task reference from an enterprise migration I led, precisely to counter the notion that this is a simple configuration exercise. The signature components of that migration were consistently the ones that required the longest lead times, the most vendor coordination, and the most careful sequencing. The telecom PQC migration challenges I have written about separately illustrate similar patterns at infrastructure scale.

What CNSA 2.0 Already Tells You

Sometimes the most informative signal is not what an organization says, but what it does first.

The NSA’s CNSA 2.0 suite, the post-quantum algorithm mandate for U.S. National Security Systems, sets differentiated timelines by use case. Software and firmware signing was the first category prioritized, with organizations encouraged to adopt hash-based signatures (LMS and XMSS, per NIST SP 800-208) immediately, full adoption required by 2025, and completion by 2030. Web and cloud services follow, with a 2025 preferred date and 2033 mandatory date. Network equipment and operating systems come after that.

The NSA put code signing first. Not key exchange. Not encryption at rest. Signing.

The reasoning is transparent: code signing establishes the root of trust. If the signing infrastructure is compromised, every other security control downstream of it can be subverted. You cannot build quantum-resistant encryption on top of quantum-vulnerable trust anchors. The NSA understood this and sequenced accordingly, and commercial CISOs would do well to follow the same logic.

And critically, hash-based signatures for firmware and code signing are deployable today. LMS and XMSS have been standardized since NIST SP 800-208 (published 2020). They require careful state management (they are stateful schemes, unlike ML-DSA), but for environments where HSMs manage the signing keys (which is every environment that takes code signing seriously) statefulness is a tractable engineering problem. AWS KMS now supports ML-DSA signing operations, and AWS Private CA can issue ML-DSA code-signing certificates. You do not need to wait for the CA/Browser Forum, for composite certificate standards, or for WebPKI ecosystem coordination. You control both the signing key and the verification infrastructure for your own software and firmware.

The gap between what is already deployable and what most organizations have actually deployed is striking. LMS and XMSS have been available for five years. ML-DSA code signing via cloud HSMs has been available since late 2025. Yet the vast majority of enterprises are still signing every firmware update, every software release, and every container image with ECDSA or RSA keys that a sufficiently capable quantum computer could derive. The tooling exists. The standards exist. The regulatory pressure exists. What is missing is the prioritization decision — and that decision is what this article is about.

This is one area where the regulatory deadlines that I have argued should drive action more than Q-Day predictions are already binding. If you are a vendor selling into NSS environments, CNSA 2.0 compliance is not optional. If you are a commercial CISO who looks to NSA guidance as a bellwether for threat prioritization (and you should), the sequencing decision has already been made for you. The complete US PQC regulatory framework makes this timeline explicit.

What CISOs Should Do Now

I recognize that arguing for signature-first migration sequencing is operationally challenging advice. It is easier to deploy hybrid TLS than to overhaul signing infrastructure. But easier is not the same as more important. Here is how I recommend CISOs approach the sequencing decision.

Inventory signature dependencies from the start. Most cryptographic inventories, the foundational first step in any PQC migration plan, focus on TLS endpoints and data-at-rest encryption. Expand the scope to include code-signing certificates and keys, firmware signing infrastructure, certificate authority dependencies (both internal and external), identity provider signing keys (SAML, OAuth, JWT), container image signing, and SBOM attestation chains. The PQC Readiness Self-Assessment Scorecard includes signature infrastructure in its scope for exactly this reason.
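One lightweight way to operationalize this expanded scope is a structured inventory record. The sketch below is illustrative only: the field names, categories, and priority rules are my assumptions for this article, not any standard inventory schema.

```python
# Illustrative record type for a signature-dependency inventory. The
# categories mirror the scope described above (code signing, firmware, CA,
# identity provider, container, SBOM); the priority logic encodes the
# article's argument that non-upgradeable verifiers are the worst case.
from dataclasses import dataclass

QUANTUM_VULNERABLE = {"RSA-2048", "RSA-3072", "ECDSA-P256", "ECDSA-P384", "Ed25519"}


@dataclass
class SignatureDependency:
    asset: str          # e.g. "firmware update pipeline"
    category: str       # code-signing | firmware | ca | idp | container | sbom
    algorithm: str      # current signing algorithm
    upgradeable: bool   # can the *verifier* accept a new algorithm?

    @property
    def quantum_vulnerable(self) -> bool:
        return self.algorithm in QUANTUM_VULNERABLE

    @property
    def priority(self) -> str:
        # Vulnerable algorithms behind non-upgradeable verifiers cannot be
        # remediated after deployment, so they outrank everything else.
        if self.quantum_vulnerable and not self.upgradeable:
            return "critical"
        return "high" if self.quantum_vulnerable else "monitor"


inventory = [
    SignatureDependency("router firmware", "firmware", "ECDSA-P256", upgradeable=False),
    SignatureDependency("release pipeline", "code-signing", "ECDSA-P256", upgradeable=True),
    SignatureDependency("internal CA root", "ca", "ML-DSA-65", upgradeable=True),
]
for dep in inventory:
    print(f"{dep.asset}: {dep.priority}")
```

The point of the exercise is the triage output: it makes visible which dependencies can wait for ecosystem coordination and which are accruing permanent risk with every unit shipped.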

Deploy LMS or XMSS for firmware and code signing now. This is the highest-confidence action you can take today. The algorithms are standardized and CNSA 2.0-approved. The deployment model is self-contained: you control the signing key and you control the verification infrastructure. No ecosystem coordination required. If your HSM supports these algorithms (and major vendors including AWS KMS now do), you can begin signing firmware and software updates with quantum-resistant algorithms this quarter.
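To make the statefulness point concrete, here is a toy illustration of why stateful hash-based schemes demand careful index management. This is a classroom Lamport one-time signature over SHA-256, not LMS or XMSS, and emphatically not production code; it exists only to show the pattern real deployments follow: advance the persisted state before signing, so that no one-time key index can ever be reused.

```python
# Toy stateful hash-based signing. Lamport one-time signatures stand in for
# the one-time keys inside LMS/XMSS; the StatefulSigner wrapper shows the
# state-management discipline (persist the index BEFORE signing) that real
# HSM-backed deployments enforce. Illustration only -- not for production.
import hashlib
import json
import os


def _h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()


def keygen():
    """One-time key: 256 pairs of secret preimages and their hashes."""
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    pk = [(_h(a), _h(b)) for a, b in sk]
    return sk, pk


def sign(sk, message: bytes):
    """Reveal one preimage per bit of the message digest."""
    digest = _h(message)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    return [sk[i][bit] for i, bit in enumerate(bits)]


def verify(pk, message: bytes, sig) -> bool:
    digest = _h(message)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    return all(_h(sig[i]) == pk[i][bit] for i, bit in enumerate(bits))


class StatefulSigner:
    """A pool of one-time keys behind a persisted index counter.

    Real LMS/XMSS deployments keep this counter inside the HSM; if the
    same index is ever used twice, the private key leaks.
    """

    def __init__(self, num_keys: int, state_file: str):
        self.keys = [keygen() for _ in range(num_keys)]
        self.state_file = state_file

    def _next_index(self) -> int:
        idx = 0
        if os.path.exists(self.state_file):
            with open(self.state_file) as f:
                idx = json.load(f)["next"]
        # Advance state BEFORE signing, so a crash can never reuse an index.
        with open(self.state_file, "w") as f:
            json.dump({"next": idx + 1}, f)
        return idx

    def sign(self, message: bytes):
        idx = self._next_index()
        if idx >= len(self.keys):
            raise RuntimeError("one-time key pool exhausted")
        sk, pk = self.keys[idx]
        return idx, pk, sign(sk, message)
```

The sharp edge is visible in `_next_index`: the counter is committed to durable storage before any signature is produced. ML-DSA, being stateless, has no such edge, which is one reason it is the long-term destination even where LMS/XMSS are the right near-term choice.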

Pilot ML-DSA certificates on internal PKI. Your internal certificate authority is not subject to CA/Browser Forum rules. AWS Private CA, DigiCert Trust Lifecycle Manager, and OpenSSL 3.5+ with the OQS provider all support ML-DSA certificate issuance today. Stand up a parallel ML-DSA CA internally, issue certificates for non-production systems, and begin testing your infrastructure’s compatibility with the larger certificate sizes and different signature verification paths. The introduction to crypto-agility I published provides the architectural foundation for this kind of parallel deployment.

Require PQC signature roadmaps from vendors. Any vendor that signs firmware updates for devices deployed in your environment should have a documented PQC signature migration plan. If they do not, that is a procurement risk that your vendor risk management process should capture. The question is simple: “When will your firmware updates be signed with a quantum-resistant algorithm?” If the answer is vague, push harder.

Identify and flag non-upgradeable signature verification. Survey your device fleet for any hardware with fused RSA or ECDSA verification logic and no mechanism for algorithm update. These devices have a finite remaining useful life bounded by quantum computing progress. Quantify the exposure, build the replacement timeline into your capital planning, and stop procuring devices without PQC-capable signature verification paths.

Run hybrid TLS migration in parallel. To be clear: nothing in this argument says you should stop or slow your encryption-side migration. Hybrid TLS is ready, the tooling is mature, and deploying it protects the confidentiality of new sessions against future quantum decryption. Do both. The argument is about where you invest your scarcest resources (leadership attention, architecture planning time, vendor coordination effort) and the answer is that the signature side needs those resources more urgently than the encryption side.

The PQC Migration Framework at PQCFramework.com provides the structured methodology for both workstreams, and Practical Steps to Quantum Readiness gives the actionable starting points. For a deeper treatment of organizational quantum readiness strategy, including how to structure the business case for board-level approval, Quantum Ready covers the full picture.

The Conventional Wisdom Was Right — Until It Wasn’t

The data-first approach to PQC migration made sense when HNDL was the only quantum threat most people understood, when ECC’s quantum vulnerability was under-researched, and when the resource estimates for breaking different algorithm families were too imprecise to compare.

None of those conditions hold any longer. The Chevignard, Fouque, and Schrottenloher paper demonstrated that P-256 ECDSA — the signature algorithm protecting the majority of the internet’s trust infrastructure — falls at 1,193 logical qubits. Google’s team showed that breaking secp256k1 signatures requires fewer than 500,000 physical superconducting qubits and approximately 9 minutes of runtime. The quantum resource threshold for forging signatures arrives before the threshold for breaking RSA encryption. The algorithm the industry chose as its modern signature standard is the most quantum-vulnerable algorithm in production use.

The NSA recognized this and prioritized code signing in CNSA 2.0. Google and Cloudflare recognized the challenge and are redesigning WebPKI authentication from the ground up with Merkle Tree Certificates. AWS and Microsoft have begun shipping ML-DSA signing capabilities in their cloud infrastructure. The ecosystem is moving on the signature side — the question is whether your organization is moving with it.

Every month you spend on TLS key exchange without parallel investment in signature migration is a month where your highest-consequence attack surface receives the least attention. I am not arguing against encrypting your traffic. I am arguing against the sequencing decision that treats encryption as the sole priority and defers signatures to “later.” In PQC migration, later arrives sooner than anyone expects, and the hardest parts take the longest. Forget the Q-Day predictions — the deadlines are already set, and the signature deadlines come first.

Your data is at risk. Your trust infrastructure is at greater risk. Start with trust.

In a forthcoming companion article, The Signature Supply Chain: How Deep Does the Trust Go?, I will map the full technical anatomy of what “signature infrastructure” actually means, covering every layer from WebPKI root CAs through code signing, firmware, identity federation, container images, and SBOM attestation. Understanding the depth of signature dependency across modern infrastructure is the prerequisite for understanding the scale of what must be migrated.

Quantum Upside & Quantum Risk - Handled

My company, Applied Quantum, helps governments, enterprises, and investors prepare for both the upside and the risk of quantum technologies. We deliver concise board and investor briefings; demystify quantum computing, sensing, and communications; craft national and corporate strategies to capture advantage; and turn plans into delivery. We help you mitigate the quantum risk by executing crypto-inventory, crypto-agility implementation, PQC migration, and broader defenses against the quantum threat. We run vendor due diligence, proof-of-value pilots, standards and policy alignment, workforce training, and procurement support, then oversee implementation across your organization. Contact me if you want help.


Marin Ivezic

I am the Founder of Applied Quantum (AppliedQuantum.com), a research-driven consulting firm empowering organizations to seize quantum opportunities and proactively defend against quantum threats. A former quantum entrepreneur, I’ve previously served as a Fortune Global 500 CISO, CTO, Big 4 partner, and leader at Accenture and IBM. Throughout my career, I’ve specialized in managing emerging tech risks, building and leading innovation labs focused on quantum security, AI security, and cyber-kinetic risks for global corporations, governments, and defense agencies. I regularly share insights on quantum technologies and emerging-tech cybersecurity at PostQuantum.com.