Quantum Readiness / PQC Migration Is The Largest, Most Complex IT/OT Overhaul Ever – So Why Wait?

Introduction

Imagine an IT/OT project in which every single device, application, and system must be discovered, assessed, and then upgraded or replaced. No exceptions. If you’ve ever tried to run even a truly comprehensive IT/OT asset-discovery exercise, you already know how hard that first step is; the steps that follow are exponentially harder.

And when I say every single device, I mean it. Not just the obvious servers, laptops, and smartphones, but the smart power strip behind a rack in your data center, the vehicle-counting sensor in the parking garage, the thermometer in the CEO’s aquarium, the smart fragrance dispenser in the bathroom, the smart forklift that just connected to the network in your loading dock… And every application and service that any of these devices rely on. If it touches your network, it relies on cryptography, and that cryptography is almost certainly vulnerable to quantum attacks in some way.

Preparing for the quantum era is arguably the largest and most complicated digital infrastructure overhaul in history. Yes, far bigger than Y2K, because back in 1999 we didn’t have millions of network-connected “things” to worry about. Yet despite clear warnings and rapidly approaching milestones, far too many organizations still treat quantum readiness as something to punt into next year – or worse, as a simple one-click software update. It won’t be that simple. Not by a long shot. If you haven’t already started planning your post-quantum migration, you’re not just behind schedule – you may already be late.

The Clock Is Ticking (2030 Is Closer Than You Think)

Time is not on our side. Just last week, the European Commission released a coordinated roadmap urging all member states to transition to quantum-safe encryption on an aggressive timeline. The key milestone: by the end of 2030, all high-risk systems – critical infrastructure, telecom, finance, government, etc. – should be secured with post-quantum cryptography. Other nations are following suit. The U.S. government initially aimed for 2035 in its PQC plans, but experts (myself included) have argued that 2035 is too slow given the threat timeline. Encouragingly, jurisdictions such as the EU and Canada have pulled their targets for critical systems forward to around 2030-2031, recognizing that we might see cryptography-breaking quantum computers by then. In fact, research and my own analysis suggest “Q-Day” (the day quantum computers can crack current crypto) could plausibly hit around 2030. That’s less than five years away – essentially tomorrow in enterprise IT planning terms.

Unfortunately, many organizations don’t yet share this urgency. A recent ISACA poll found that 62% of security professionals worry about quantum computers breaking encryption, yet only 5% say it is a high priority for their organization, and just 5% report having a defined quantum strategy in place. Another study of federal agencies revealed that only 8% have fully assessed their cryptographic exposure to quantum risk (and while about half are developing strategies, most still lack a clear roadmap or budget).

This complacency is setting us up for trouble.

No, It’s Not a One-Click Upgrade (and Why That Misconception Is Dangerous)

Talking to industry peers, I often encounter a dangerous misunderstanding: the perception that a PQC upgrade will be straightforward – like applying a routine patch or swapping out an algorithm library and calling it a day. Nothing could be further from the truth. There is no one-button update that makes your organization quantum-safe. In reality, quantum readiness will require years of effort and meticulous planning. We’ll have to approach the transition gradually and by criticality, addressing the highest-risk vulnerabilities first.

Remember the Y2K remediation effort? That’s the other misconception I run into – that Q-Day preparation is just like Y2K, and Y2K turned out to be no big deal. Even for Y2K, we had to comb through code and update date handling system by system. Now take that scope and effort and multiply it many times over. With Y2K, everyone understood the deadline and the fix was relatively clear and contained. This time, the challenge is far more complex and sprawling. We don’t have universal public awareness or a single “flag day” to rally around, and the solution isn’t a simple code tweak – it’s a wholesale modernization of our cryptographic infrastructure. Even identifying all the instances of the various cryptographic protocols in use is vastly more complex than searching for Y2K-vulnerable date formats. We have to retrofit security into everything we’ve built over decades, without the certainty of a fixed deadline to work toward.

To make matters worse, adversaries aren’t waiting – they’re already harvesting encrypted data now, hoping to decrypt it later when quantum capability arrives. So even if some experts think strong quantum computers are 10-15 years away, the risk to long-lived sensitive data is immediate. Every extra year of procrastination is another year attackers might be siphoning off encrypted sensitive data, with plans to crack it down the line.

The hard truth is that implementing PQC across an enterprise will likely be the largest and most complex digital transformation your organization has ever undertaken.

How large and complex?

Inventory Everything: You Can’t Secure What You Don’t Know

The first reality check is scope. Before you can even start “fixing” anything, you must identify every place in your environment where cryptography is used. This is not just an abstract best practice – it’s an explicit recommendation from top regulators. Practically all regulators urge organizations to first build detailed cryptographic inventories covering all assets and systems using public-key cryptography. Why such emphasis? Because cryptography is everywhere. It’s in obvious places like VPNs, secure web servers, and databases, but also in places you might overlook: printers, HVAC controls, badge readers, IoT sensors, industrial control systems, smart TVs in conference rooms – you name it. If it connects to your network or holds sensitive data, chances are it uses cryptography somewhere under the hood.

In practice, this inventory task is a monster. In one recent assessment, for example, I helped a client discover over 160 distinct categories of connected “smart” devices in a single office building – not 160 devices, but 160 types of devices, each with its own hardware, software, and firmware. None of it was under IT’s control: these devices had been installed and connected to the network by facilities, engineering, corporate security, and various other groups, without ever being placed under enterprise IT and cybersecurity management. And this was just an office building – imagine an enterprise that also runs factories, hospitals, or utility plants full of specialized operational technology (OT) and Internet of Things (IoT) devices.

So the very first step is gaining full visibility into your cryptographic footprint: what algorithms you’re using, where, and for what purpose.

Be warned: building a cryptographic inventory is not trivial. Automated discovery tools can help find certificates and encryption libraries, but no tool will magically find 100% of the cryptography in your organization. And you are definitely not going to build a cryptographic bill of materials (CBOM) solely through interviews and spreadsheets, as some organizations attempt to do. Crypto can be deeply embedded in custom applications, hidden in legacy hardware, or hard-coded in outdated firmware. Achieving a truly comprehensive inventory requires a combination of automated scans, manual code reviews, vendor documentation checks, and ongoing updates. It’s painstaking work, but essential. If that sounds exhausting, that’s because it is.
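
As a taste of what the automated-scan piece can look like, here is a minimal sketch that probes TLS endpoints and flags quantum-vulnerable certificate keys, assuming Python with the widely used cryptography package. The host list is hypothetical, and a real inventory would cover far more than TLS – SSH, code signing, VPNs, firmware, and beyond:

```python
# Minimal sketch: probe TLS endpoints and record which public-key
# algorithms their certificates use. Assumes Python 3 with the
# third-party "cryptography" package; the host list is hypothetical.
import socket
import ssl

from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import ec, rsa

HOSTS = ["intranet.example.com", "vpn.example.com"]  # hypothetical targets

def classify_cert(der_bytes: bytes) -> str:
    """Return a rough quantum-risk label for a certificate's public key."""
    cert = x509.load_der_x509_certificate(der_bytes)
    key = cert.public_key()
    if isinstance(key, rsa.RSAPublicKey):
        return f"RSA-{key.key_size} (quantum-vulnerable via Shor's algorithm)"
    if isinstance(key, ec.EllipticCurvePublicKey):
        return f"ECC {key.curve.name} (quantum-vulnerable via Shor's algorithm)"
    return f"{type(key).__name__} (review manually)"

for host in HOSTS:
    ctx = ssl.create_default_context()
    ctx.check_hostname = False          # inventory scan, not a trust decision
    ctx.verify_mode = ssl.CERT_NONE
    with socket.create_connection((host, 443), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            der = tls.getpeercert(binary_form=True)
            print(f"{host}: {classify_cert(der)}")
```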

In my experience, some of the most complex and politically charged projects I’ve led were exactly these “simple” asset discovery initiatives in OT environments. Often, what starts as a basic inventory exercise uncovers all sorts of organizational blind spots, turf wars, and accountability gaps (“Who’s responsible for that legacy control system? Who knew we even had that device connected?”). It can get messy internally. But as exhausting as it is, it’s the only way to know the full scope of what you need to fix.

Layered Cryptography: One Device, Multiple Challenges

Identifying all your cryptographic assets is only the beginning. Next comes understanding what exactly needs to change for each one. Here’s the kicker: almost every system has multiple layers of cryptography that may need attention. A single device or application might use a dozen different cryptographic mechanisms – protocols, key exchanges, digital signatures, certificates, encryption algorithms, you name it. And each of those instances will need to be evaluated for quantum vulnerability. If it’s using a quantum-vulnerable algorithm (like RSA, ECC, or finite-field Diffie-Hellman), we need a plan to mitigate it in some way. And it’s not as simple as “find-and-replace RSA with PQC.” Different post-quantum algorithms may be required for different purposes, and they come with new performance characteristics that need to be assessed and planned for before the migration.
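
To illustrate the “one device, many mechanisms” problem, here is a minimal sketch that walks a hypothetical inventory record and flags each mechanism separately. The replacement mapping reflects NIST’s published FIPS 203/204/205 standards; the device record itself is made up:

```python
# One device, many cryptographic mechanisms, each needing its own verdict.
# Shor's algorithm breaks RSA/ECC/DH; symmetric ciphers and hashes are only
# weakened and survive at larger sizes. The device record is hypothetical.
QUANTUM_VULNERABLE = {"RSA", "ECDSA", "ECDH", "DH", "DSA"}
NIST_PQC_REPLACEMENTS = {
    "key-exchange": "ML-KEM (FIPS 203)",
    "signature": "ML-DSA (FIPS 204) or SLH-DSA (FIPS 205)",
}

device = {  # hypothetical inventory record for a single VPN gateway
    "name": "branch-vpn-gw-01",
    "mechanisms": [
        {"purpose": "key-exchange", "algorithm": "ECDH"},
        {"purpose": "signature", "algorithm": "RSA"},      # certificate chain
        {"purpose": "signature", "algorithm": "ECDSA"},    # firmware signing
        {"purpose": "bulk-encryption", "algorithm": "AES-128"},
        {"purpose": "integrity", "algorithm": "SHA-256"},
    ],
}

for m in device["mechanisms"]:
    family = m["algorithm"].split("-")[0]
    if family in QUANTUM_VULNERABLE:
        fix = NIST_PQC_REPLACEMENTS.get(m["purpose"], "needs a case-by-case plan")
        print(f'{device["name"]}: {m["purpose"]}/{m["algorithm"]} -> migrate to {fix}')
    else:
        print(f'{device["name"]}: {m["purpose"]}/{m["algorithm"]} -> review key-size margins')
```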

This is precisely why the EU and other authorities stress detailed crypto inventories and crypto-agility – because the transition will be a massive, granular undertaking. In large environments – say a national banking system or a military network – this means touching an astronomical number of components. It’s mind-numbing to even contemplate. In fact, past cryptographic transitions have taken on the order of 10 years (some are still ongoing today). And that was with far fewer devices and less complexity than we have now.

To make matters worse, transitioning to PQC has been aptly compared to “changing the engines on an airplane in mid-flight.” You have to keep everything running, serving customers and users, while you carefully swap out the cryptographic engines under the hood. There’s no downtime window where the business can just go dark for a few months while we re-tool everything – it all must happen seamlessly, in parallel with live operations.

To manage this, you’ll need to prioritize by risk. Not all cryptographic uses are equal. Some protect very sensitive or long-lived data (think: patient health records, state secrets, critical infrastructure controls), while others might be guarding less critical, short-lived information. A common principle emerging from standards bodies is to tackle high-risk use cases first – systems where a compromise (even years from now) would be catastrophic, or data that needs to remain confidential for a decade or more. Those should be at the top of your PQC migration list now. In practice, that could mean deploying hybrid post-quantum solutions in the next year or two for things like your VPN tunnels and PKI certificates, ensuring that if any data is intercepted, it’s protected by at least one quantum-safe layer.
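
One common way to formalize this prioritization is Mosca’s inequality: if the years your data must remain secret (x) plus the years your migration will take (y) exceed the years until a cryptographically relevant quantum computer arrives (z), you are already at risk. A minimal sketch of that triage, with hypothetical systems and estimates:

```python
# Mosca-style triage: a system is already at risk when
# shelf_life + migration_years > years_to_qday.
# The Q-Day estimate and the example systems are hypothetical placeholders.
YEARS_TO_QDAY = 5  # planning assumption (~2030), not a prediction

systems = [
    # (name, data shelf life in years, estimated migration effort in years)
    ("patient-records-db", 25, 4),
    ("vpn-concentrators", 10, 2),
    ("marketing-cms", 1, 1),
]

for name, shelf_life, migration in systems:
    at_risk = shelf_life + migration > YEARS_TO_QDAY
    priority = "MIGRATE NOW" if at_risk else "schedule later"
    print(f"{name}: x+y = {shelf_life + migration}y vs z = {YEARS_TO_QDAY}y -> {priority}")
```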

Lower-risk systems will follow later, but make no mistake – eventually everything that uses cryptography will need to be addressed.

Not All Systems Will Be Simple to Fix (Hello, IoT and OT)

Another sobering reality: a significant chunk of your technology estate cannot be easily upgraded to PQC with just a software patch. Sure, your enterprise servers, operating systems, and mainstream applications – especially those supported by big vendors – will eventually have patches or new versions that support quantum-safe algorithms. Your cloud providers will likely roll out PQC support in their services behind the scenes. Those are (relatively) the easier pieces. The hard parts will be all the legacy, embedded, and constrained devices scattered throughout your environment.

Think about all the IoT gadgets and OT systems running on minimal firmware, many of which haven’t seen a security update in years (if ever). Most run on lightweight processors that already struggle with today’s encryption protocols – let alone the heftier key sizes and computations of many post-quantum algorithms. In some cases, this might force us to replace or significantly upgrade hardware just to handle the new algorithms. Upgrading software in a data center is one thing; upgrading thousands of tiny IoT devices sprinkled across your facilities is something else entirely.
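
To make the “heftier” point concrete, here is a quick comparison using the parameter sizes published in the NIST standards against common classical counterparts. A minimal sketch; the exact impact on any given device will of course vary:

```python
# Size comparison in bytes: NIST PQC standards vs. common classical
# algorithms. Keys and per-operation wire artifacts grow by one to two
# orders of magnitude, which is painful on constrained IoT/OT hardware.
sizes = [
    # (algorithm, public key bytes, signature-or-ciphertext bytes)
    ("X25519 key exchange", 32, 32),          # classical
    ("ML-KEM-768 (FIPS 203)", 1184, 1088),    # post-quantum KEM
    ("Ed25519 signature", 32, 64),            # classical
    ("ML-DSA-65 (FIPS 204)", 1952, 3309),     # post-quantum signature
]
for name, pubkey, artifact in sizes:
    print(f"{name:24s} pubkey={pubkey:5d}B  sig/ct={artifact:5d}B")
```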

For these hard-to-upgrade cases, organizations will need to get creative. One approach is to build in crypto-agility wherever possible – essentially designing wrappers or interfaces around systems so that you can swap out cryptographic components without replacing the whole system. If a device’s built-in crypto can’t be changed, maybe you can encapsulate it. For example, you might put a gateway in front of a bunch of legacy IoT sensors. The sensors keep humming along as they are, but the gateway adds a quantum-resistant envelope around their communications. In essence, you decouple the device’s weak crypto from its main function by offloading the crypto to something you can upgrade.
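
As a rough illustration of that envelope pattern, here is a minimal sketch assuming the open-source liboqs-python bindings (github.com/open-quantum-safe/liboqs-python) with ML-KEM-768 enabled, plus AES-256-GCM from the cryptography package. The sensor frame format is hypothetical, and key distribution is deliberately omitted:

```python
# Minimal sketch of a PQC "envelope" gateway for legacy sensor traffic.
# Assumes liboqs-python exposes ML-KEM-768 under this name; the sensor
# frame format is hypothetical.
import os

import oqs
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

KEM_ALG = "ML-KEM-768"  # assumption: enabled in your liboqs build

# Head-office side: long-term KEM keypair (distribution out of scope here).
office = oqs.KeyEncapsulation(KEM_ALG)
office_public_key = office.generate_keypair()

def wrap_frame(plaintext_frame: bytes) -> tuple[bytes, bytes, bytes]:
    """Gateway side: encapsulate a fresh secret, then seal the legacy frame."""
    gateway = oqs.KeyEncapsulation(KEM_ALG)
    kem_ciphertext, shared_secret = gateway.encap_secret(office_public_key)
    nonce = os.urandom(12)
    sealed = AESGCM(shared_secret).encrypt(nonce, plaintext_frame, None)
    return kem_ciphertext, nonce, sealed

# Office side: decapsulate and unseal.
kem_ct, nonce, sealed = wrap_frame(b"temp=21.5C;unit=hvac-07")  # hypothetical frame
secret = office.decap_secret(kem_ct)
print(AESGCM(secret).decrypt(nonce, sealed, None))  # b'temp=21.5C;unit=hvac-07'
```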

Another near-term strategy is deploying hybrid cryptographic solutions: use both a classical algorithm and a PQC algorithm together, so that even if one eventually fails, the other still provides protection. This is also great for maintaining compatibility during the transition. We’re already seeing vendors offer hybrid certificates and dual-algorithm support in protocols like TLS to facilitate this kind of gradual transition.
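
Here is a minimal sketch of that dual-algorithm idea at the key-establishment level: derive one session key from both a classical X25519 exchange and an ML-KEM encapsulation, so breaking either algorithm alone is not enough. This loosely mirrors the pattern in hybrid TLS drafts, simplified; X25519 and HKDF come from the cryptography package, and the ML-KEM half again assumes liboqs-python:

```python
# Hybrid key establishment sketch: session key depends on BOTH a classical
# X25519 secret and a post-quantum ML-KEM secret, combined via one KDF.
import oqs
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Classical half: ephemeral X25519 exchange.
alice_x, bob_x = X25519PrivateKey.generate(), X25519PrivateKey.generate()
classical_secret = alice_x.exchange(bob_x.public_key())

# Post-quantum half: ML-KEM-768 encapsulation to Bob's KEM public key.
bob_kem = oqs.KeyEncapsulation("ML-KEM-768")
bob_kem_pub = bob_kem.generate_keypair()
kem_ct, pq_secret_alice = oqs.KeyEncapsulation("ML-KEM-768").encap_secret(bob_kem_pub)
pq_secret_bob = bob_kem.decap_secret(kem_ct)

def session_key(classical: bytes, pq: bytes) -> bytes:
    """Concatenate both secrets and derive one key; lose either, lose the key."""
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"hybrid-demo").derive(classical + pq)

assert session_key(classical_secret, pq_secret_alice) == \
       session_key(classical_secret, pq_secret_bob)
```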

In some cases, the reality is you will have to replace devices entirely. Many legacy systems (both hardware and software) simply won’t get a post-quantum firmware update from their manufacturer – the vendor may no longer exist, or the product might not have enough horsepower to implement the new algorithms. Part of your quantum readiness plan should be identifying these dead-ends early: which products in your environment are effectively un-upgradeable? Those will either need to be retired, replaced with quantum-safe alternatives, or isolated in a way that their weaknesses can’t be exploited. None of those options are quick or cheap. They require budget, procurement, testing, and deployment – all of which can take years.

And keep in mind, PQC itself is evolving. NIST may have finalized a first set of standards, but cryptographers continue to scrutinize and test them. We’ve already seen one of the initially promising algorithms (an encryption scheme called SIKE) get completely broken by researchers during the standardization process. It wouldn’t be surprising if, by 2030, new and improved quantum-resistant algorithms emerge or some current ones face new weaknesses. That uncertainty means we have to build flexibility into our plans.

If your cryptography is hard-coded, inflexible, and untracked, then pivoting to new algorithms (whether that’s PQC or something beyond) will be a nightmare. So you’ll have to make sure that wherever cryptography is used, it can be located, updated, and swapped out without breaking everything around it. This could involve refactoring applications to use centralized cryptographic libraries (so you can update algorithms in one place), upgrading middleware and operating systems to support pluggable crypto algorithms, and insisting that new tech purchases are “quantum-ready” (i.e. they can accept new algorithms via firmware or software updates). In short, design for change, because our cryptographic tools will certainly change over the coming years. We won’t have the multi-decade stability we used to enjoy.
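
What does “pluggable” look like in practice? Here is a minimal sketch: route every signing call through one registry, so that swapping algorithms later is a configuration change rather than a codebase-wide hunt. The provider names are hypothetical; Ed25519 comes from the cryptography package:

```python
# Crypto-agility sketch at the application layer: all code signs through
# one registry, so replacing Ed25519 with a PQC signer later is a config
# change, not a hunt through every call site. Provider names hypothetical.
from typing import Callable, Dict

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

_SIGNERS: Dict[str, Callable[[bytes], bytes]] = {}

def register_signer(name: str, signer: Callable[[bytes], bytes]) -> None:
    _SIGNERS[name] = signer

_key = Ed25519PrivateKey.generate()
register_signer("ed25519-v1", _key.sign)
# Later, one line enables a PQC signer without touching call sites, e.g.:
# register_signer("ml-dsa-65-v1", my_mldsa_signer)   # hypothetical

ACTIVE_SIGNER = "ed25519-v1"  # central config: the only thing that changes

def sign(message: bytes) -> bytes:
    """Every call site uses this; the algorithm behind it is swappable."""
    return _SIGNERS[ACTIVE_SIGNER](message)

print(sign(b"firmware-manifest").hex()[:32], "...")
```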

The Biggest IT Transformation Program Ever? Prepare for a Long Haul

I hope you are now getting the sense that when I say preparing for the quantum threat is probably the most complex IT project you’ll ever undertake, I’m not exaggerating. We’re basically talking about changing the tires on a moving truck – or, again, the engines on a flying plane – across the entire digital world. And we have to do it under the pressure of an advancing threat that gives us no definitive deadline but plenty of indicators that sooner is better. It’s also a business transformation, not just a technical one. You’ll need senior leadership buy-in (this effort will span multiple budget cycles and organizational silos), cross-functional teams (IT, security, risk management, procurement, and compliance all have roles to play), and a clear governance structure.

Realistically, executing a full PQC migration will take 5-10 years for most large organizations. Don’t take my word for it – look at history and expert analyses. Major banks started moving off SHA-1 and 1024-bit RSA in the mid-2010s and some are still finishing that transition. Government agencies have struggled with eliminating obsolete crypto for many years (just look at how long it’s taken to enforce TLS 1.2+ or to remove old encryption like 3DES from systems). PQC migration is an order of magnitude more challenging. I’d argue that a thorough, enterprise-wide quantum-safe overhaul will be five (or more) years of hard work, long hours, and inevitable complexities. The European Commission explicitly notes that even starting today, an “orderly transition” by 2035 is ambitious.

Stop Procrastinating – The Quantum Threat Won’t Wait

It’s time for some tough love. To my fellow CIOs, CISOs, and IT leaders: I know you have a dozen other digital transformation projects on your plate and a hundred other fires to fight. But quantum readiness needs to move to the forefront of your strategic priorities – now. Every day of delay is one day less to do an incredibly complex job, and one more day for adversaries to exploit our lack of preparation. We already have the initial tools and standards in hand (NIST has published PQC algorithms, vendors are rolling out support, governments are setting hard deadlines). We know enough to start acting; lack of perfect information is no longer a valid excuse for inaction.

Take it from someone who’s been in the trenches: as just one example – I’ve been advising, on and off, a major telecom on quantum readiness for nearly 10 years now, and they are still nowhere close to the finish line. And that’s an organization that at least acknowledged the problem and started early. Many others haven’t even begun. If you haven’t started your program yet, imagine how much catching up you’ll have to do – possibly in a hurry. I’ve also seen firsthand the consequences of delaying critical upgrades; I’ve led investigations into massive breaches that ultimately boiled down to organizations procrastinating on security updates until it was too late.

Frankly, I’m baffled that so many companies remain in “wait-and-see” mode regarding quantum risks. Perhaps it’s simply comforting to assume that today’s encryption will remain unbroken for many years, or that when quantum does arrive, we can just deal with it then (or trust our vendors to magically fix it for us). But the harvest-now, decrypt-later dynamic means that data being stolen today could be decrypted in a few years if we don’t start re-encrypting it with quantum-safe methods soon. And what if the optimistic timelines are wrong? What if, say, a breakthrough in 2028 or 2029 yields a capable quantum machine faster than expected? Technological progress isn’t linear or predictable – it can leap, and timelines can compress rapidly when a breakthrough hits. By the time a definite “Q-Day” is upon us, it will be far too late to retrofit the entire global digital infrastructure in a hurry.

Since the work has to be done eventually, ask yourself: why wait, and make the task even harder?

Marin Ivezic

I am the Founder of Applied Quantum (AppliedQuantum.com), a research-driven professional services firm dedicated to helping organizations unlock the transformative power of quantum technologies. Alongside leading its specialized service, Secure Quantum (SecureQuantum.com)—focused on quantum resilience and post-quantum cryptography—I also invest in cutting-edge quantum ventures through Quantum.Partners. Currently, I’m completing a PhD in Quantum Computing and authoring an upcoming book “Practical Quantum Resistance” (QuantumResistance.com) while regularly sharing news and insights on quantum computing and quantum security at PostQuantum.com. I’m primarily a cybersecurity and tech risk expert with more than three decades of experience, particularly in critical infrastructure cyber protection. That focus drew me into quantum computing in the early 2000s, and I’ve been captivated by its opportunities and risks ever since. So my experience in quantum tech stretches back decades, having previously founded Boston Photonics and PQ Defense where I engaged in quantum-related R&D well before the field’s mainstream emergence. Today, with quantum computing finally on the horizon, I’ve returned to a 100% focus on quantum technology and its associated risks—drawing on my quantum and AI background, decades of cybersecurity expertise, and experience overseeing major technology transformations—all to help organizations and nations safeguard themselves against quantum threats and capitalize on quantum-driven opportunities.