Quantum Security & PQC News

Google Just Drew a Line in the Sand: PQC Migration by 2029

25 Mar 2026 – The company building quantum computers is telling you the clock is running out. That should get your attention.

On March 25, 2026, Google published a brief but consequential blog post authored by Heather Adkins, VP of Security Engineering, and Sophie Schmieg, Senior Staff Cryptography Engineer. Buried beneath a deliberately understated title – “Quantum frontiers may be closer than they appear” – was a commitment that should ripple across every CISO’s desk and every board risk committee on the planet: Google is targeting 2029 to complete its migration to post-quantum cryptography.

Not 2035, the outer bound of NIST’s disallowance timeline. Not 2030, when NIST plans to deprecate legacy algorithms. 2029 – a full year ahead of the official deprecation date, and six years ahead of the final deadline.

This is not a company hedging its bets or issuing vague guidance about “starting to plan.” This is Google, which operates the world’s most widely used browser, one of the two dominant mobile operating systems, and a major cloud infrastructure platform, announcing a hard target for completing one of the most complex cryptographic transitions in computing history. And it’s doing so while simultaneously running one of the world’s most advanced quantum computing research programs – a detail that deserves far more scrutiny than it has received.

The Dual Position No One Else Occupies

To understand why Google’s announcement matters more than a typical corporate security roadmap update, you have to appreciate the company’s unique, and somewhat uncomfortable, dual role in the quantum landscape.

Google Quantum AI has been targeting the construction of a useful, error-corrected quantum computer by 2029 since CEO Sundar Pichai announced that goal at Google I/O in 2021. The Willow chip, unveiled in late 2024, demonstrated below-threshold quantum error correction for the first time – a milestone that moved the company measurably closer to that goal. And in May 2025, Google researchers published a preprint showing that 2048-bit RSA could theoretically be broken by a quantum computer with just 1 million noisy qubits running for one week – a 20-fold reduction from the team’s previous estimate of 20 million qubits.

So when Google’s security team sets a PQC migration deadline of 2029, this isn’t arbitrary. It is a company that has better visibility into the trajectory of quantum hardware than virtually any other organization on Earth, telling you that the timeline for a cryptanalytically relevant quantum computer (CRQC) has compressed enough to warrant completing migration before the end of this decade.

Google, to its credit, frames the 2029 target carefully. The blog post states the timeline reflects “migration needs for the PQC era in light of progress on quantum computing hardware development, quantum error correction, and quantum factoring resource estimates.” The company is not saying a CRQC will arrive in 2029. But the implicit message is clear: by Google’s internal calculus, the risk has become sufficient that completing migration by then is the rational move.

For anyone still working off assumptions that a CRQC is “15 to 20 years away” – a refrain that has been echoing since roughly 2015 and hasn’t updated with the evidence – this should serve as a reality check.

Why Authentication Got Bumped to the Front of the Queue

The most technically significant detail in Google’s announcement isn’t the timeline itself. It’s a single sentence buried midway through the post: Google says it has “adjusted” its threat model to prioritize PQC migration for authentication services, meaning digital signatures, and recommends that other engineering teams do the same.

A note of editorial precision is warranted here. The threat model document Google links to was published in March 2024 – two years ago. That document organized PQC priorities around SNDL risk for encryption-in-transit first, with firmware and software signatures as subsequent concerns. Today’s blog post claims an adjustment to that model, elevating authentication services, but the linked document hasn’t been visibly updated and no new threat model document accompanies the announcement. What we’re working with, in other words, is a declaration of intent – significant because of who’s making it, but thin on the analytical detail you’d expect to support such a fundamental reprioritization.

Still, the directional shift matters. For years, the conventional wisdom has been straightforward: encryption (specifically, key exchange) is the most urgent PQC migration target because of store-now-decrypt-later (SNDL) attacks. An adversary harvesting encrypted traffic today can decrypt it once a quantum computer becomes available. Digital signatures, by contrast, were generally viewed as a future-facing problem – they only become vulnerable when a CRQC actually exists, because forging a signature requires real-time access to quantum computation.

Google’s stated reprioritization suggests the company’s internal threat modeling has concluded that the signature problem is more urgent than the conventional framing implies. There are good reasons for this. Signature keys tend to be long-lived, deeply embedded across infrastructure, and far harder to rotate than ephemeral encryption keys. A CRQC doesn’t need to forge every signature ever created – it just needs to forge the right one. An attacker with access to even limited quantum computation might target high-value signing keys (root certificates, code signing keys, firmware signing keys) rather than attempting to brute-force terabytes of archived ciphertext.

The strategic implication is significant. As Google’s own research noted in its security blog last May, signature keys are “more attractive targets to attack, especially when compute time on a quantum computer is a limited resource.” The first generation of CRQCs won’t be unlimited in capacity. They will be expensive, constrained machines. And a rational adversary will use scarce quantum resources where they create maximum leverage – which is authentication and signing infrastructure, not bulk decryption.

For my regular readers, this claimed reprioritization, however thinly documented, will sound familiar. I first articulated the signature threat as a distinct and arguably more dangerous quantum risk category back in 2018, coining the term “Sign Today, Forge Tomorrow” (STFT) – a concept I later expanded under the now-gaining-traction label “Trust Now, Forge Later” (TNFL). Where Google offers a single sentence about an “adjusted” threat model linking to a two-year-old document, I have published eight years of detailed analysis arguing that forged signatures strike at integrity and authenticity – which in many domains, especially operational technology and critical infrastructure, are more consequential than confidentiality. A compromised signing key in an OT environment can cause physical, kinetic consequences that no volume of decrypted archived ciphertext can match. The Applied Quantum PQC Migration Framework operationalizes this insight across its 8-phase methodology, treating the signature gap as a first-order risk rather than a secondary concern – with dedicated sector extensions for OT/CNI environments where the TNFL threat is most acute.

What’s changed today is not the analysis – it’s who is now endorsing it. Google is the first hyperscaler to publicly state that authentication services should be prioritized in PQC migration and recommend that other engineering teams follow suit. When a company of Google’s scale and quantum expertise arrives at the same conclusion that independent practitioners have been advocating since 2018, it validates the urgency – and removes the last excuse for organizations still treating digital signature migration as a “Phase 2” problem. It’s Phase 1. It has been Phase 1 for years. Google just said so out loud.

The 2029 Deadline vs. NIST’s 2030/2035 Timeline: What the Gap Reveals

Google’s 2029 target sits in an interesting position relative to the regulatory landscape.

NIST’s transition guidance, outlined in the initial public draft of IR 8547, establishes a phased timeline: deprecate vulnerable algorithms (RSA, ECDSA, EdDSA, DH, ECDH at 112-bit security) after 2030, and disallow them entirely after 2035. The NSA’s CNSA 2.0 advisory aligns with the 2035 outer bound for National Security Systems.

Google is targeting completion a full year before NIST’s deprecation milestone – not the disallowance milestone. This is significant because deprecation, in NIST’s framework, doesn’t mean “prohibited.” It means “being phased out due to security risks.” Organizations can technically still use deprecated algorithms after 2030, just with increasing risk and regulatory friction. Disallowance in 2035 is the hard stop.

By finishing its migration in 2029, Google is making a statement that deprecation-era uncertainty isn’t acceptable for its risk profile. And in doing so, it creates an uncomfortable comparison point for every other large technology company.

Microsoft, for its part, has publicly targeted full PQC transition by 2033 – two years ahead of the 2035 deadline but four years behind Google. Microsoft’s Quantum Safe Program outlines a phased approach: core infrastructure migration starting in 2026, integration into Windows, Azure, and Microsoft 365 beginning in 2027, with early adoption expected by 2029 and full completion by 2033. IBM has focused heavily on tooling – the Quantum Safe Explorer for cryptographic inventory and PQC hardware acceleration in its z16 mainframe – but has not, to my knowledge, published a specific completion target for its own migration.

The comparison isn’t entirely apples-to-apples. Google’s infrastructure is, by design, more centralized and cloud-native than Microsoft’s sprawling ecosystem of operating systems, enterprise software, and legacy compatibility requirements. Google has been architecting for crypto-agility since 2016, when it first began experimenting with post-quantum key exchange in Chrome. Microsoft faces a harder problem: hundreds of millions of Windows installations, Active Directory deployments, and enterprise environments that weren’t designed with algorithm agility in mind.

But that qualification doesn’t diminish the signal value. If anything, it sharpens the question every enterprise should be asking: if Google – with its purpose-built infrastructure and decade-long head start – needs until 2029, what does that imply for organizations that haven’t started?

Android 17 and the Consumer-Facing Front

Alongside the timeline announcement, Google confirmed that Android 17 will integrate PQC digital signature protection using ML-DSA (the NIST-standardized algorithm formerly known as CRYSTALS-Dilithium). With Android’s stable release widely expected in June 2026, this makes Android one of the first major mobile platforms to deploy post-quantum digital signature algorithms in production.

This builds on Google’s earlier PQC deployments across its product stack: hybrid post-quantum key exchange (using ML-KEM, formerly CRYSTALS-Kyber) in Chrome since 2024, PQC-protected internal communications via the ALTS protocol, and PQC signature schemes in public preview within Cloud KMS.

The Android move is particularly notable because it addresses the consumer side of the equation. Most PQC discussions focus on enterprise infrastructure, cloud services, and government systems. But mobile devices are the endpoint through which billions of people interact with the digital economy – conducting financial transactions, authenticating identity, accessing health records, and communicating privately. If those endpoints aren’t quantum-safe, the cryptographic chain is only as strong as its weakest link.

The choice of ML-DSA for Android is also worth noting. This is a signature algorithm, not a key encapsulation mechanism, which reinforces the prioritization shift discussed above. Google isn’t just protecting data in transit on Android devices – it’s protecting the authentication layer that verifies software updates, establishes device identity, and validates application integrity. These are precisely the long-lived, high-leverage signing operations that a resource-constrained CRQC would target first.

The Elephant in the Enterprise: Most Organizations Can’t Do What Google Just Did

There’s an uncomfortable truth embedded in Google’s announcement: 2029 is almost certainly achievable for Google, and almost certainly unachievable for most of the organizations that need to hear this message.

Google has spent a decade building crypto-agility into its infrastructure. It controls its own browser, its own mobile OS, its own cloud platform, its own internal network protocols. It employs some of the world’s leading cryptographers. When Sophie Schmieg writes a blog post about PQC migration, it’s because she and her team have been doing the actual work for years.

The average enterprise? A recent peer-reviewed study published in the journal Computers estimated realistic PQC migration timelines at 5–7 years for small enterprises, 8–12 years for medium enterprises, and 12–15+ years for large enterprises. The researchers noted that unlike previous cryptographic transitions (AES, SHA-2, TLS 1.3), PQC migration involves larger parameter sizes, hybrid cryptographic schemes, and unprecedented ecosystem coordination. IoT devices with 10-to-20-year operational lifespans, HSMs that can’t be firmware-upgraded, and embedded systems with 32KB of RAM that can’t accommodate PQC certificate chains – these are the realities that no Google blog post can engineer away.

And the workforce problem is real. The cryptographic engineering community is small. Few professionals have hands-on experience with lattice-based cryptography. The NIST NCCoE’s PQC migration practice guide has drawn over 47 industry collaborators – including AWS, IBM, Microsoft, Cisco, JPMorgan Chase, and PQShield – but even this consortium has found that hybrid PQC deployments roughly halve network throughput in testing, with significant latency increases.

The gap between what Google can do and what the broader ecosystem can do is the central challenge of the PQC transition. Google setting a 2029 target is leadership. But it risks becoming a standard that only a handful of hyperscalers can meet, while the rest of the economy enters the CRQC era with partially migrated or entirely unprotected infrastructure.

What Google’s Timeline Actually Means for Your Migration

If you’re a CISO, a CTO, or a board member reading this, here’s the strategic translation of what happened today.

Google – the company with more visibility into quantum computing progress than almost any private institution – has concluded that completing PQC migration before the end of this decade is the prudent course of action. Not starting migration. Completing it. That assessment should inform your own risk calculus, regardless of whether you can match Google’s pace.

The practical takeaways are specific enough to act on. First, if you haven’t completed a cryptographic inventory, you are already behind. PCI DSS v4.0 has required documented cryptographic inventories since March 2025. This isn’t optional – it’s compliance baseline, and it’s also the prerequisite for everything else.
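As a concrete starting point, the triage logic behind such an inventory can be sketched in a few lines. Everything below is illustrative – the `CryptoAsset` schema, the system names, and the ranking rule are my own assumptions, not any standard’s schema – but it captures the two decisions an inventory must support: which algorithms are quantum-vulnerable, and which assets to migrate first.

```python
from dataclasses import dataclass

# Algorithms the NIST IR 8547 initial public draft slates for deprecation
# after 2030 and disallowance after 2035.
QUANTUM_VULNERABLE = {"RSA", "ECDSA", "EdDSA", "DH", "ECDH"}

@dataclass
class CryptoAsset:
    """One entry in a cryptographic inventory (hypothetical schema)."""
    system: str
    algorithm: str
    key_lifetime_years: int
    purpose: str  # "signature" or "key-exchange"

def triage(assets):
    """Flag quantum-vulnerable assets, listing long-lived signing keys first."""
    at_risk = [a for a in assets if a.algorithm in QUANTUM_VULNERABLE]
    # Long-lived signature keys are the highest-leverage targets (the TNFL
    # risk), so rank signatures ahead of key exchange, longest lifetime first.
    return sorted(
        at_risk,
        key=lambda a: (a.purpose != "signature", -a.key_lifetime_years),
    )

inventory = [
    CryptoAsset("web-frontend", "ECDH", 0, "key-exchange"),
    CryptoAsset("firmware-signing", "RSA", 15, "signature"),
    CryptoAsset("api-gateway", "ML-KEM", 0, "key-exchange"),
    CryptoAsset("code-signing", "ECDSA", 5, "signature"),
]

for asset in triage(inventory):
    print(f"{asset.system}: {asset.algorithm} ({asset.purpose})")
```

Note that the already-migrated ML-KEM endpoint drops out of the worklist entirely, while the 15-year firmware signing key surfaces at the top – exactly the kind of asset the reprioritization discussed above is about.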

Second, reassess your signature infrastructure with the same urgency you’ve applied to key exchange. Google’s reprioritization of authentication services is a signal that the industry’s most sophisticated threat modelers have revised their assumptions. Audit your root certificate authorities, code signing pipelines, firmware signing processes, and any system where long-lived signing keys are in use.
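One way to operationalize that audit is to check each long-lived signing key against the NIST milestone dates. The sketch below is a hypothetical illustration: the `audit_signing_key` helper and its verdict strings are my own framing, and the cutoff dates are derived from the IR 8547 draft’s “after 2030” / “after 2035” language.

```python
from datetime import date

DEPRECATION = date(2031, 1, 1)   # NIST IR 8547 draft: deprecated after 2030
DISALLOWANCE = date(2036, 1, 1)  # disallowed after 2035
CLASSICAL_SIG = {"RSA", "ECDSA", "EdDSA"}

def audit_signing_key(name, algorithm, not_after):
    """Return an audit verdict for one signing key (illustrative logic)."""
    if algorithm not in CLASSICAL_SIG:
        return f"{name}: OK - post-quantum or out of scope"
    if not_after >= DISALLOWANCE:
        return f"{name}: CRITICAL - classical key valid into the disallowed era"
    if not_after >= DEPRECATION:
        return f"{name}: WARN - classical key valid into the deprecated era"
    return f"{name}: REVIEW - expires before deprecation, but plan rotation"

# A root CA that outlives disallowance is the clearest red flag.
print(audit_signing_key("root-ca", "RSA", date(2040, 6, 1)))
print(audit_signing_key("code-signing", "ECDSA", date(2032, 1, 1)))
print(audit_signing_key("tls-leaf", "ECDSA", date(2027, 3, 1)))
print(audit_signing_key("firmware", "ML-DSA", date(2040, 1, 1)))
```

Run across a real certificate and key estate, even this crude bucketing tends to surface the uncomfortable cases quickly: root CAs and firmware signing keys whose validity periods already extend past 2035.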

Third, engage your vendors now – not after they announce PQC support, but before. Microsoft is targeting 2033 for full transition. Your cloud provider’s timeline is your dependency. If your HSM vendor, PKI provider, or certificate authority hasn’t published a PQC roadmap, that silence is itself a risk factor.

Fourth, plan for hybrid deployments as the realistic transition mechanism. Pure PQC cutover is not feasible for most environments. NIST anticipates hybrid implementations combining classical and PQC algorithms as the bridge. But those hybrid modes carry performance costs – budget for capacity increases and latency testing.
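Conceptually, a hybrid scheme derives one session key from two independent shared secrets, so the session stays secure unless both the classical and the PQC exchange are broken. The minimal sketch below, using only the Python standard library, shows the combiner idea; the labels, HKDF parameters, and placeholder secrets are illustrative assumptions, not any deployed protocol’s actual construction.

```python
import hashlib
import hmac

def hkdf(salt, ikm, info, length=32):
    """Minimal HKDF (RFC 5869) with SHA-256, built from the stdlib."""
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()
    okm, block, counter = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

def hybrid_combine(classical_secret, pqc_secret):
    """Derive one session key from both shared secrets: an attacker must
    break BOTH exchanges to recover it (illustrative combiner)."""
    return hkdf(
        salt=b"hybrid-kex-v1",        # hypothetical domain-separation label
        ikm=classical_secret + pqc_secret,
        info=b"session-key",
    )

# Stand-in secrets; in a real deployment these would come from, e.g.,
# an X25519 exchange and an ML-KEM-768 encapsulation.
classical = b"\x01" * 32  # placeholder ECDH shared secret
pqc = b"\x02" * 32        # placeholder ML-KEM shared secret
session_key = hybrid_combine(classical, pqc)
print(len(session_key), "bytes")
```

The performance cost mentioned above comes not from this cheap key derivation but from the larger PQC handshake messages on the wire – which is why capacity and latency testing belongs in the migration budget.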

Fifth, recognize that 2035 is the deadline, not the target. NIST’s timeline gives you until 2035 for full disallowance. Google is telling you, through its own actions, that 2029 is the right ambition. The truth for most enterprises is somewhere in between – but the planning, piloting, and procurement decisions that determine where you land must be made now, not in 2030.

For organizations that need a structured starting point, the Applied Quantum PQC Migration Framework provides an open-access, 8-phase lifecycle methodology – from securing executive mandate through sustained crypto-agility – with sector-specific extensions for financial services, telecommunications, OT/critical national infrastructure, and government/defense. Its 90-day quick-start guide is designed for exactly the situation many organizations now find themselves in: knowing they need to move, but not knowing where to begin.

Quantum Upside & Quantum Risk – Handled

My company – Applied Quantum – helps governments, enterprises, and investors prepare for both the upside and the risk of quantum technologies. We deliver concise board and investor briefings; demystify quantum computing, sensing, and communications; craft national and corporate strategies to capture advantage; and turn plans into delivery. We help you mitigate the quantum risk by executing crypto‑inventory, crypto‑agility implementation, PQC migration, and broader defenses against the quantum threat. We run vendor due diligence, proof‑of‑value pilots, standards and policy alignment, workforce training, and procurement support, then oversee implementation across your organization. Contact me if you want help.


Marin Ivezic

I am the Founder of Applied Quantum (AppliedQuantum.com), a research-driven consulting firm empowering organizations to seize quantum opportunities and proactively defend against quantum threats. A former quantum entrepreneur, I’ve previously served as a Fortune Global 500 CISO, CTO, Big 4 partner, and leader at Accenture and IBM. Throughout my career, I’ve specialized in managing emerging tech risks, building and leading innovation labs focused on quantum security, AI security, and cyber-kinetic risks for global corporations, governments, and defense agencies. I regularly share insights on quantum technologies and emerging-tech cybersecurity at PostQuantum.com.