Sovereignty in the Post-Quantum Cryptography (PQC) Era

Introduction

Post-Quantum Cryptography (PQC) is entering the standards stage, with the U.S. National Institute of Standards and Technology (NIST) recently selecting the first quantum-resistant algorithms. However, the future of PQC will not be as straightforward as simply adopting NIST’s choices globally. A strong push for digital sovereignty is emerging around the world, driven by eroding trust in foreign (particularly U.S.) technology. Nations are seeking greater control over their cryptography and cybersecurity, which means the coming quantum-safe world could splinter into multiple standards and approaches rather than one universally adopted suite. The implications of this trend are far-reaching.

NIST’s PQC Standards – A Starting Point, Not the Final Word

The United States has led the early effort in PQC. NIST began a multi-year open competition in 2016 to identify quantum-resistant public-key algorithms. In July 2022, NIST announced the four winning algorithms it would standardize – CRYSTALS-Kyber for public-key encryption (a key encapsulation mechanism) and CRYSTALS-Dilithium, FALCON, and SPHINCS+ for digital signatures. Three of the four were published as final FIPS standards in August 2024 – Kyber as ML-KEM (FIPS 203), Dilithium as ML-DSA (FIPS 204), and SPHINCS+ as SLH-DSA (FIPS 205) – with FALCON’s standard (FN-DSA) still to follow. These mark the first wave of official PQC solutions. Historically, many countries (and companies worldwide) have looked to NIST’s cryptographic standards as a baseline, and NIST’s PQC algorithms are expected to be widely adopted internationally.

However, adoption of NIST standards is not a given in all quarters, especially in an age of heightened geopolitical tech skepticism. Past incidents have undermined global trust in U.S.-led cryptography – most notably the 2013 revelations of a potential backdoor in Dual_EC_DRBG, a NIST-recommended random number generator. Such events reinforced the belief in some nations that blindly trusting foreign-developed crypto could be a security risk. “Historically, China hasn’t trusted the cryptography standards the US puts out and have developed their own. I think this is also true with regards to Russia,” noted Dustin Moody, the NIST mathematician leading the PQC project. In other words, even as NIST’s PQC standards become global defaults, major powers like China and Russia have long signaled intent to diverge on critical encryption standards.

Furthermore, “digital sovereignty” has become a policy buzzword, especially in the EU and other regions, referring to the desire to control one’s own digital infrastructure and not be overly dependent on foreign technology. Encryption is a core part of this. The EU’s Security Union strategy, for instance, explicitly highlights cryptography as key to resilience and autonomy. The drive for sovereignty means that while NIST’s algorithms are an important starting point, they are unlikely to be the only game in town. Many nations are charting their own paths – whether by creating indigenous post-quantum algorithms, tweaking existing ones, or adding extra layers of security on top of internationally standardized primitives.

China: Pursuing Independent Post-Quantum Standards

China has embarked on an ambitious path toward its own post-quantum cryptographic standards, diverging from the U.S.-led effort in a very explicit way. In February 2025, China’s Institute of Commercial Cryptography Standards (ICCS) – under the State Cryptography Administration – announced a global call for new quantum-resistant algorithms. This initiative invites proposals worldwide for post-quantum algorithms in multiple categories (public-key encryption/key exchange, digital signatures, as well as quantum-resistant hash functions and block ciphers), with the goal of developing Chinese national standards that can withstand quantum attacks. While welcoming international submissions, the effort is clearly geared towards giving China an encryption suite it controls. The ICCS noted it will evaluate proposed algorithms for security, performance, and implementation feasibility, aiming to eventually standardize the best candidates as Chinese national algorithms.

The motivation for China’s independent push is rooted in both security and politics. Experts suggest that Beijing’s move “reflects concerns over potential US intelligence ‘back doors’ in encryption standards”. Chinese authorities remember the lessons of previous cryptographic controversies and likely view even the openly developed NIST PQC algorithms with some suspicion simply because the selection process was U.S.-hosted. By developing its own suite, China minimizes the risk (real or perceived) of hidden vulnerabilities that could be exploited by foreign adversaries. Additionally, it aligns with China’s broader push for technological self-reliance and control over critical infrastructure. Cryptography is seen as too strategic to outsource. As Moody from NIST observed, it isn’t surprising that China is doing its own standards since “in the larger picture… historically, China hasn’t trusted [U.S.] standards”.

It’s worth noting that China hasn’t waited until 2025 to begin exploring PQC. Chinese researchers have been active in post-quantum research for years. In fact, Chinese-designed algorithms were submitted to the NIST competition (such as LAC, a lattice-based KEM), though none ultimately won out in NIST’s final round. Separately, the Chinese Association for Cryptographic Research ran a domestic PQC contest around 2018-2020, reportedly yielding algorithms like LAC (encryption) and Aigis (a signature scheme) as finalists. China has even been piloting some of these homegrown algorithms in certain applications; LAC, for example, was considered for early implementations as a potential quantum-safe VPN method. These efforts signal that China is serious about having its own alternatives to Western cryptography. The new 2025 ICCS call builds on that foundation but with a more organized, global scope – perhaps to attract international talent and subject candidate algorithms to broader scrutiny, all under Chinese leadership.

Another factor is that by defining its own standards, China can mandate their use domestically (for government, military, and even commercial sectors) and possibly in countries within its sphere of influence. This tactic mirrors what China did with classical cryptography: it created indigenous standards like the SM2 elliptic-curve public key algorithm, SM3 hash function, and SM4 block cipher, and required foreign tech products in China to support them. We may see a similar pattern where any company doing business in China must implement Chinese PQC algorithms (once standardized) in their products, either alongside or instead of the NIST ones.

It’s also notable that while China expresses distrust of U.S. standards, other nations worry about Chinese cryptography too. Encryption sovereignty cuts both ways. U.S. and European experts have speculated that China might seek to “integrate its own covert access points into its encryption protocols” – essentially, the mirror image of the concern China has about U.S. backdoors. In an authoritarian context, having national algorithms could facilitate domestic surveillance or control. This is not proven, but it’s a reminder that each nation’s push for “sovereign” algorithms may be viewed with skepticism by others. For China, however, the official stance is that an independent PQC standard is about protecting Chinese communications from foreign espionage and boosting national technological prestige.

Russia: Developing National Quantum-Resistant Algorithms

Like China, Russia has a long history of favoring indigenous cryptography, and this stance continues into the post-quantum era. The Russian Federation did not simply plan to adopt NIST’s PQC choices wholesale; instead, it has been quietly working on its own set of post-quantum standards. Since 2019, Russia’s federal technical standardization body (Rosstandart) has had a dedicated working group (WG 2.5) under Technical Committee 26 focused on post-quantum cryptographic mechanisms. This group, involving leading Russian cryptographers (including those from a company called Kryptonite), has been designing and evaluating algorithms that could replace or complement Russia’s current GOST standards with quantum-resistant equivalents.

In late 2022, the Russian government took a further step by establishing the National Technology Center for Digital Cryptography, under the Ministry of Digital Development and the Academy of Cryptography. This center is orchestrating a formal evaluation process – essentially a Russian PQC competition – to vet candidate algorithms for standardization. The plan involves inviting proposals for algorithms and subjecting them to cryptanalysis, much like NIST did, although details are less publicly transparent. The Center started accepting candidate algorithms via TC26 subgroups, and also commenced its own research projects into PQC. In other words, Russia is constructing a pipeline to produce Russian-standard PQC algorithms in the near future.

Several candidate algorithms for Russian PQC standards have already emerged from domestic research. One notable example is “Шиповник” (Shipovnik) – meaning ‘Rosehip’ in Russian – which is a post-quantum digital signature scheme. Shipovnik was introduced by researchers at Kryptonite as a replacement for the classic GOST 34.10-2018 signature (which is an elliptic-curve signature). It’s described as Russia’s first post-quantum signature scheme, and interestingly, it has been made open-source to encourage analysis.

Another is “Кодиеум” (Codiaeum), a proposed key encapsulation mechanism (KEM) unveiled in 2024 as a potential successor to Diffie-Hellman key exchange. Codiaeum is based on the hardness of decoding random linear codes (specifically built on the syndrome decoding problem for Goppa codes) – an approach similar in spirit to the classic McEliece encryption scheme. By using code-based cryptography, the designers leverage a class of problems believed to resist quantum attacks. These efforts are domestic analogues to the types of algorithms NIST standardized (lattices, hashes, etc.), but designed and vetted by Russian experts to meet Russian requirements.

In addition to Shipovnik, another Russian signature algorithm called “Hypericum” has been mentioned in reports. (The name “Hypericum” is the Latin genus of St. John’s Wort, continuing the trend of plant-themed codenames after Rosehip.) Both Shipovnik and Hypericum are reportedly being considered by the TC26 working group, and both have publicly accessible implementations. The key point is that Russia intends to have its own portfolio of PQC algorithms for signatures and encryption, rather than relying solely on “Western” choices.

As of late 2025, Russia has not formally issued a national PQC standard yet, but the groundwork is laid. We can expect in the next couple of years the approval of one or more algorithms as GOST standards (the Russian government standards for cryptography). Once that happens, any products for the Russian market (especially in government and critical infrastructure) would be required to use those algorithms. During the Cold War and beyond, the Soviet Union/Russia used entirely separate cryptosystems (like the GOST 28147-89 block cipher instead of DES, or GOST 34.10 elliptic curves instead of ECDSA). We appear to be heading for a similar bifurcation in the quantum era: Russian systems will deploy Russian PQC algorithms, aligning with the country’s desire for cryptographic sovereignty and distrust of foreign algorithms.

It’s also worth mentioning that Russia’s pursuit of independent standards is not just about defensive security. Controlling the algorithms in use can also facilitate government access to encrypted data (e.g., through escrow systems or intentional weaknesses known only to the state). While there’s no concrete evidence of backdoors in Russian PQC designs, the closed nature of their standardization process means the international community will scrutinize any Russian algorithm with caution. Nonetheless, from a sovereignty perspective, Moscow gains assurance that no foreign entity can subtly weaken their encryption standards – only they hold the reins.

South Korea: A Parallel PQC Standardization Effort

South Korea, an American ally with a strong domestic tech capacity, provides an interesting example of a country following a sovereign cryptography path even while aligned with the West. Rather than simply waiting for NIST and then adopting those algorithms, South Korea launched its own PQC project in 2021: the Korean Post-Quantum Cryptography (KpqC) competition. This competition was organized by the national intelligence and security agencies (led by the National Intelligence Service, NIS, with support from the National Security Research Institute) as part of a government PQC master plan. The goal was to evaluate and standardize quantum-resistant algorithms specifically for use in Korea’s governmental and industrial systems.

After several rounds of evaluation, South Korea announced the winners of its KpqC competition in January 2025, selecting four algorithms to form the core of its post-quantum standards. These include two for digital signatures and two for public-key encryption/key exchange (KEMs):

  • HAETAE (signature) – A module-lattice digital signature in the same family as NIST’s CRYSTALS-Dilithium (standardized as ML-DSA). HAETAE uses a different rejection-sampling technique, which makes it more efficient and yields more compact signatures than standard Dilithium. The name “Haetae” refers to a mythical creature in Korean culture, underlining its domestic origin despite the algorithmic similarity to Dilithium.
  • AIMer (signature) – A signature scheme built on the “MPC-in-the-Head” paradigm, a post-quantum approach derived from zero-knowledge proofs, combined with a dedicated symmetric one-way function (AIM). AIMer is not lattice-based; it has larger signatures and slower performance relative to Dilithium, but its security rests on symmetric-key assumptions rather than lattice hardness. By selecting AIMer, Korea ensured at least one of its signature standards does not depend on lattices, providing diversity in case lattice problems are ever compromised.
  • SMAUG-T (encryption) – A KEM similar to NIST’s CRYSTALS-Kyber (standardized as ML-KEM, a module-lattice KEM). SMAUG-T is built on the same class of hard lattice problems as Kyber, with some design tweaks, and its security and performance are regarded as on par with Kyber. (“Smaug” evokes the dragon of literature; the suffix “T” reflects the scheme’s merger with the TiGER candidate during the competition.)
  • NTRU+ (encryption) – A variant of the NTRU lattice-based encryption family. NTRU was a finalist in NIST’s process but ultimately not chosen (Kyber was favored). Korea’s NTRU+ refines the original NTRU design, aiming for efficiency and security improvements. By adopting NTRU+, South Korea keeps alive an alternative lattice approach – one originally developed by U.S. researchers decades ago but left outside NIST’s first standards – and gains an option in case any issue arises with Kyber-type designs.

South Korea’s selection thus mirrors the NIST list in some ways (a Kyber-like and a Dilithium-like algorithm) but also deliberately adds alternatives (an NTRU encryption and a non-lattice signature). This strategy provides cryptographic redundancy and sovereignty – Korea has its own versions and can pivot among them based on future developments, without being 100% tied to NIST’s exact choices. The KpqC algorithms are set to be implemented in Korean government systems, with a roadmap for transition by 2035 that aligns with U.S. and European timelines for quantum-proofing security.

It remains to be seen how widely these Korean-specific algorithms will be used outside of Korea. They will undergo scrutiny from the global cryptographic community. As one report noted, these schemes would require significant implementation effort for international crypto libraries and hardware to support, since they differ in various details from the NIST standards.

Nevertheless, Korea has signaled that even close U.S. allies value having domestic control over cryptography. Their approach hedges against solely relying on foreign standards, and possibly gives Korean companies a stake in PQC IP (intellectual property) and know-how. It also ensures that if Korean entities are communicating among themselves, they can use nationally vetted algorithms (potentially seen as more trustworthy within the country). The KpqC competition is a microcosm of the sovereignty trend: even within the U.S. security umbrella, states are preparing indigenous solutions for the quantum era.

Europe: Balancing NIST Adoption with Digital Autonomy

Europe presents a nuanced case. European cryptographers were deeply involved in the NIST PQC competition – in fact, many of the winning algorithms have European co-authors or even originate from European research projects. For example, CRYSTALS-Kyber and Dilithium had contributors from Belgium, the UK, Germany, the Netherlands, etc. As a result, the European technical community has a high comfort level with NIST’s chosen algorithms. Many EU countries have indicated they will adopt NIST PQC standards in principle. For instance, in early 2025 Germany’s federal office BSI incorporated NIST’s PQC algorithms into its official guidelines, and Dutch authorities (TNO and others) similarly recommended piloting those algorithms in infrastructures. This suggests a transatlantic alignment – for now, businesses operating across the Atlantic can target a single family of algorithms without breaking national rules, which is good for interoperability among Western nations.

However, the European Union also places a strong emphasis on strategic autonomy, and cryptography is seen as a key area for asserting digital sovereignty. The EU has been crafting a coordinated approach for PQC transition. In 2022, ENISA (the EU cybersecurity agency) and the European Commission recommended that member states begin preparing for PQC. By June 2025, the NIS Cooperation Group (which coordinates EU countries on cybersecurity) adopted a “Coordinated Implementation Roadmap” for PQC. This roadmap calls for all Member States to start migration by 2026 and, notably, suggests that Europe develop EU PQC algorithms and adopt them as standards to be implemented across the EU. In other words, the EU is not content to only use NIST’s standards; it has set a goal to identify or create European-led PQC algorithms by 2026 which could complement or even replace the NIST ones in certain use cases.

The rationale here is partly trust (wanting full insight into algorithms and not relying on a non-EU institution for security) and partly industrial (encouraging European research and possibly IP ownership in the next generation of crypto). EU officials have cited the need for “cross-border interoperability” in a quantum-safe Europe, which implies all members agreeing on standards – potentially standards developed within Europe itself. Funding has been put into PQC research under EU programs (at least €11 million in recent calls), and bodies like ETSI’s Quantum-Safe Cryptography Working Group have been working on practical implementation aspects. We might see, for example, Europe endorsing additional algorithms that NIST did not pick. One candidate could be Classic McEliece (an American-invented code-based encryption scheme that NIST left as an alternate); another could be variants of lattice schemes or entirely different approaches emerging from European research projects.

Several major European countries have already signaled flexibility. France’s ANSSI and Germany’s BSI have stated they will follow NIST’s lead “but not be restricted to it.” They continue to evaluate other candidates and local needs. The Netherlands went as far as issuing guidelines recommending a hybrid key exchange that includes one of NIST’s algorithms plus an alternative like FrodoKEM (a non-NIST lattice scheme) or Classic McEliece to hedge bets. The Dutch approach (via its National Cyber Security Centre) explicitly lists multiple PQC options and advises combining them with classical elliptic-curve Diffie-Hellman in TLS for now. This reflects a cautious stance: because post-quantum algorithms are new, using a diversity of algorithms reduces the chance that a single breakthrough (or flaw or backdoor) will compromise security. Spain’s cybersecurity strategy similarly mentioned both NIST algorithms and an algorithm like FrodoKEM as part of a phased migration plan. These examples show that European agencies are keeping their options open, even as they embrace the NIST standards. Crypto-agility (the ability to swap out algorithms) is heavily emphasized in Europe as a principle.

Another dimension in Europe is the interplay with quantum communications (QKD). The EU is investing in quantum key distribution infrastructure (the EuroQCI project) for certain government communications. While QKD is separate from PQC (it’s hardware-based key exchange using quantum physics), the EU envisions a mix of QKD and PQC for different needs. For most digital systems, PQC is the practical solution, but the existence of QKD efforts underscores Europe’s inclination to have multiple independent security mechanisms (again aligning with sovereignty – not relying on one technology controlled by one party). The EU recommendation explicitly states a combined use of PQC and QKD could be ideal in the long run, with PQC prioritized in the near term and QKD as a long-term addition for certain high-security links.

In summary, Europe is currently walking a line between global compatibility and autonomy. In the immediate timeline, European organizations are adopting NIST-chosen algorithms, ensuring they’re not left behind in the transition. But concurrently, the EU is organizing its member states to potentially uplift European-grown algorithms into the spotlight by mid-decade. If successful, this means in a few years Europe might have a set of “EU PQC” standards in addition to NIST’s (the EU Parliament has even discussed this in the context of digital sovereignty initiatives). For companies, this could mean that complying with EU requirements might involve using a particular set of algorithms beyond the NIST list. It’s a space to watch: the EU’s decisions in 2025-2026 will clarify whether it fully diverges with unique standards or simply adds more options to an internationally common pool. Either way, Europe’s actions underscore that political factors will shape cryptographic choices, not just technical merit.

Other Nations and Sovereign Cryptography Moves

Outside the major players above, many other countries are also grappling with the balance between adopting international standards and asserting their own cryptographic preferences:

United Kingdom, Canada, Australia, Japan and others

These U.S. allies have so far aligned closely with the NIST process. The UK’s National Cyber Security Centre (NCSC), for example, advises organizations to prepare for PQC using the NIST standards, and Canada’s CSE has a roadmap coordinated with NIST’s timeline.

Japan’s cryptography authority (CRYPTREC) has been monitoring NIST and preparing PQC guidelines accordingly. There is no indication that these countries will introduce completely new algorithms of their own; instead, they focus on timely migration and perhaps small tweaks. For instance, CRYPTREC in 2023 published detailed PQC transition guidelines, essentially endorsing the NIST choices while ensuring they meet local requirements (such as using Japan’s preferred symmetric primitives, like the Camellia cipher, in implementations where applicable).

These nations prioritize interoperability with the U.S. and each other, so they are less likely to deviate at the algorithm level. That said, each will enforce its own certification and evaluation of implementations, potentially leading to differences in approved parameters or usage modes (e.g., the UK might stipulate certain algorithms from the NIST list for government use, or Australia may require higher security settings for certain data).

India

India has not announced any national PQC standard as of 2025, but it has shown interest in quantum-safe cryptography as part of its National Quantum Mission. Indian experts acknowledge the need to migrate to PQC to protect the country’s cybersecurity as quantum computers advance.

A domestic cybersecurity startup, QNu Labs, has developed a lattice-based PQC key exchange mechanism called “Hodos”, building on the same lattice techniques studied in the NIST process, and reportedly it has been trialed for use by the Indian Army and Navy. This indicates some level of indigenous capability.

However, these efforts are currently in the private sector and academic realm; the Indian government has not issued a distinct standard yet. It’s likely that India will adopt a set of PQC tools that mix global standards with any home-grown solutions that prove reliable, aligning with its broader goal of technological self-reliance. In the absence of a formal standard, Indian organizations will probably implement the NIST algorithms, potentially alongside indigenous solutions like Hodos for specific strategic applications.

Others in Asia and Middle East

Countries like Singapore and Israel are also planning for PQC (Israel, for one, has required financial institutions to start inventorying cryptographic usage and be ready with a PQC transition plan). They mostly reference NIST’s work, but given their advanced tech sectors, they could introduce nuances.

Israel, for example, might incorporate its own cryptographic libraries or prefer algorithms with certain properties (Israel has a history of cryptographic research and might contribute to analysis of PQC).

In the Middle East, nations such as the UAE or Saudi Arabia are investing in quantum tech and could likewise choose to follow either the U.S./European standards or, conceivably, Chinese ones in the future, depending on geopolitical alignments.

BRICS and others

We’ve covered China, Russia, and India from the BRICS bloc. Brazil and South Africa have been quieter on PQC specifics. They will likely follow recommendations from international bodies (Brazil often mirrors U.S. NIST standards due to historical ties in cryptography, and South Africa might follow what the major economies do).

However, the BRICS as a group have shown interest in technological cooperation independent of Western dominance. It’s not inconceivable that in the future there could be a set of BRICS-endorsed cryptographic standards (for example, Russia and China might push their algorithms through BRICS forums). This remains speculative, but it underscores that global standards might fork along political lines.


In summary, many countries are starting with NIST’s algorithms as the default because those are the most vetted globally, but a number of them are evaluating other options, be it for added security or local pride. Even those not developing new algorithms are asserting sovereignty by the way they implement or mandate cryptography. This can mean requiring certain levels of security or certain combinations that align with national policy. We turn next to some of those methods – the technical strategies countries use to tailor or augment cryptography in pursuit of sovereignty.

Techniques for Asserting Cryptographic Sovereignty

Nations have several tools in their toolbox to assert control over cryptographic implementations without necessarily inventing a completely new mathematical scheme from scratch. These approaches ensure that even if an algorithm is internationally known, its usage in a given country aligns with local security policies and trust considerations. Below we outline a few such techniques:

Homegrown Standards (Independent Algorithms)

The most direct approach is what China, Russia, and South Korea are doing – designing and standardizing entirely domestic PQC algorithms. By running their own selection competitions and research programs, these countries ensure the algorithms (and often the reference implementations) are under their oversight.

The advantage is full sovereignty: they are not reliant on foreign decisions for security. The challenge, of course, is that the algorithms need to be sound; this approach demands significant cryptographic expertise and investment, as well as a community of cryptanalysts to vet the new designs. When successful, homegrown standards give a country bargaining power in international crypto discussions and the ability to mandate algorithms that outsiders might have less expertise in attacking.

For example, China’s future PQC standards arising from its ICCS solicitation will likely become mandatory for Chinese government and perhaps businesses, effectively creating a parallel cryptographic ecosystem.

Russia’s GOST PQC algorithms would similarly be required for Russian classified communications. The downside is interoperability issues – systems using different algorithms can’t communicate securely without some bridging mechanism. This is why global firms may need to support multiple sets of algorithms, as we discuss later.

Hybridizing Algorithms (Wrapping foreign algorithms with domestic ones)

One ingenious way to reduce reliance on a single algorithm is to use hybrid cryptographic schemes – combining two different algorithms such that an attacker would need to break both to defeat the security. Typically discussed in the context of combining classical and post-quantum algorithms (to hedge against quantum advances), the same concept can apply to mixing an international PQC algorithm with a domestic algorithm.

In practice, this could mean, for instance, encrypting a message such that it requires both a NIST-standard Kyber key exchange and a national algorithm (say, a Russian code-based KEM or a Chinese lattice variant) to fully obtain the key. The idea is that even if one algorithm has a hidden weakness, the odds that both algorithms are weak (and exploitable by an adversary) are far lower – especially if they are based on independent mathematical problems.

Some Western security agencies already recommend hybrid approaches; for example, Dutch guidelines advise combining traditional ECC with one of several PQC KEMs in TLS handshakes. By the same logic, a country could mandate combining an NIST PQC algorithm with a local algorithm for certain high-security transactions.

This “belt-and-suspenders” approach ensures sovereignty because the country’s own cipher is always in play (so a foreign-introduced backdoor in the international algorithm alone wouldn’t be sufficient to compromise security). Hybrid schemes do carry performance overhead and added implementation complexity, and they require careful standardization to avoid interoperability chaos. Nevertheless, we are likely to see more of this.
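To make the hybrid idea concrete, here is a minimal Python sketch of a KEM combiner, with random byte strings standing in for real KEM outputs. Real deployments would use a standardized combiner construction (work on such constructions is ongoing at the IETF), but the core property – the session key depends on both shared secrets, so both KEMs must be broken – is as shown.

```python
import hashlib
import hmac
import os

def hkdf(salt: bytes, ikm: bytes, info: bytes, length: int = 32) -> bytes:
    """Minimal HKDF (RFC 5869) over SHA-256, used as the key combiner."""
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()
    okm, block, counter = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

def combine_shared_secrets(ss_intl: bytes, ss_domestic: bytes,
                           ct_intl: bytes, ct_domestic: bytes) -> bytes:
    # Bind the session key to BOTH shared secrets and both ciphertexts,
    # so an attacker must break both KEMs to recover the key.
    ikm = ss_intl + ss_domestic
    info = b"hybrid-kem-demo" + hashlib.sha256(ct_intl + ct_domestic).digest()
    return hkdf(salt=b"\x00" * 32, ikm=ikm, info=info)

# Stand-ins for real KEM outputs (e.g., ML-KEM plus a national KEM):
ss1, ct1 = os.urandom(32), os.urandom(1088)   # "international" KEM
ss2, ct2 = os.urandom(32), os.urandom(1000)   # "domestic" KEM
session_key = combine_shared_secrets(ss1, ss2, ct1, ct2)
print(session_key.hex())
```

Binding the ciphertexts into the derivation (not just the secrets) is a common hardening choice, since it ties the resulting key to the exact handshake transcript.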

In the context of certificates and digital signatures, there are even proposals for composite certificates that can include two signature algorithms (e.g., one could include both an ECDSA signature and a Dilithium signature, or a Dilithium plus a domestic signature) such that a verifier can check both.

Some governments may adopt composite signing to meet multi-standard compliance – for example, a device might produce a composite signature that satisfies both a US rule (via Dilithium) and a Russian rule (via a GOST PQC signature) at once. This is an emerging area of standards (IETF and ISO are working on hybrid/composite cryptography definitions) aimed at easing multi-algorithm support.
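The composite logic itself is simple: sign the same message under every component algorithm, and accept only if every component verifies. Below is a toy sketch, assuming Ed25519 (via the pyca/cryptography package) as the classical component and an HMAC construction as a stand-in for the PQC component – HMAC is symmetric, not a real signature, and is used here only to keep the sketch self-contained.

```python
# pip install cryptography
import hashlib, hmac, os
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Component 1: a real classical signature (Ed25519).
ed_priv = Ed25519PrivateKey.generate()
ed_pub = ed_priv.public_key()

# Component 2: a STAND-IN for a PQC / national signature scheme.
pq_key = os.urandom(32)
def pq_sign(msg: bytes) -> bytes:
    return hmac.new(pq_key, msg, hashlib.sha3_256).digest()
def pq_verify(msg: bytes, sig: bytes) -> bool:
    return hmac.compare_digest(pq_sign(msg), sig)

def composite_sign(msg: bytes) -> tuple[bytes, bytes]:
    # One signature per component algorithm, over the same message.
    return ed_priv.sign(msg), pq_sign(msg)

def composite_verify(msg: bytes, sigs: tuple[bytes, bytes]) -> bool:
    ed_sig, pq_sig_val = sigs
    try:
        ed_pub.verify(ed_sig, msg)     # raises InvalidSignature on failure
    except InvalidSignature:
        return False
    return pq_verify(msg, pq_sig_val)  # valid only if BOTH components verify

msg = b"composite-signed firmware image"
print(composite_verify(msg, composite_sign(msg)))  # True
```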

Custom Parameters and Implementations

Not all cryptographic divergence happens at the algorithm level; sometimes it’s about the choices made within an algorithm’s use. Many PQC algorithms have adjustable parameters (key sizes, etc.) that affect security and performance. A country might require using a higher security parameter set than the default recommended by NIST if they have a lower risk tolerance.

For instance, NIST selected Kyber-768 (roughly “security level 3” in its criteria) for general use, but the Netherlands’ NCSC advised using Kyber-1024 (security level 5) for long-term confidentiality needs. That is a local policy choice to err on the side of caution – Kyber-1024 has larger keys but a stronger security margin.

Similarly, a nation’s standards body might approve only certain parameter sets of an algorithm – e.g., disallowing smaller keys even if they are part of an international standard.
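In practice, such a national policy often reduces to a whitelist check at key-generation or negotiation time. A sketch with purely illustrative policy contents (these are not any agency’s actual rules):

```python
# Hypothetical national parameter policy: only the listed parameter sets
# of each approved algorithm may be used. Contents are illustrative.
APPROVED_PARAMS = {
    "ML-KEM": {"ML-KEM-1024"},             # e.g., require level-5 Kyber only
    "ML-DSA": {"ML-DSA-65", "ML-DSA-87"},  # disallow the smallest set
}

def check_policy(algorithm: str, parameter_set: str) -> None:
    allowed = APPROVED_PARAMS.get(algorithm)
    if allowed is None:
        raise ValueError(f"{algorithm} is not an approved algorithm")
    if parameter_set not in allowed:
        raise ValueError(f"{parameter_set} not approved for {algorithm}")

check_policy("ML-KEM", "ML-KEM-1024")          # passes silently
try:
    check_policy("ML-KEM", "ML-KEM-512")       # below the mandated level
except ValueError as e:
    print("rejected:", e)
```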

Another form of customization is swapping out internal components of algorithms for national ones. For example, many PQC schemes use standard hash functions (SHA-3, etc.) or random oracles internally. A country could mandate using its own approved hash function inside an algorithm – say, using Russia’s GOST Streebog hash in a hash-based signature, or China’s SM3 hash instead of SHA-256 in a lattice scheme. This doesn’t change the high-level algorithm, but it asserts sovereignty at the implementation level and ensures the algorithm relies on primitives that the nation’s experts trust and have evaluated. (It also means any potential weakness in the hash, if introduced, is under the nation’s control rather than a foreign one.)

We’ve seen precedents: when adopting RSA and ECC in the past, Russia and China often redefined them to use domestic padding schemes, domestic hashing, etc. We can expect similar with PQC. One concrete example: China’s cryptography law requires use of approved domestic ciphers for certain sectors – if NIST’s algorithms are used, they might be encapsulated in a protocol that adds an extra Chinese crypto layer or uses Chinese crypto for symmetric encryption of the payload. Thus, even if the core PQC math is foreign, the overall system remains in line with domestic standards.
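The “swap the internal primitive” pattern is straightforward in code: write the construction against an injected hash function rather than a hard-coded one. Here is a sketch using a simple hash chain as a stand-in for the internal hash calls of a larger scheme; SM3 or Streebog could be plugged in from a national crypto library (Python’s hashlib exposes such algorithms only when the underlying OpenSSL build provides them).

```python
import hashlib
from typing import Callable

Hash = Callable[[bytes], bytes]

def sha3_256(data: bytes) -> bytes:
    return hashlib.sha3_256(data).digest()

def hash_chain(seed: bytes, steps: int, h: Hash = sha3_256) -> bytes:
    """Iterate the injected hash; the same structure works whatever
    approved primitive (SHA-3, SM3, Streebog, ...) is plugged in."""
    out = seed
    for _ in range(steps):
        out = h(out)
    return out

# Default (international) primitive:
print(hash_chain(b"seed", 1000).hex())

# A national primitive could be injected the same way, e.g. SM3 –
# available from hashlib only if the local OpenSSL build provides it:
# sm3 = lambda d: hashlib.new("sm3", d).digest()
# print(hash_chain(b"seed", 1000, h=sm3).hex())
```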

Layered (Dual) Encryption in Sensitive Contexts

For the most sensitive communications (military, intelligence), some countries might not trust any single algorithm or implementation. They can employ layered encryption, essentially encrypting data twice (or even more times) with independent algorithms and keys. This concept has been used historically (for instance, in certain NATO environments, highly classified data was encrypted with two different national algorithms in succession, so that compromise of one alone wouldn’t expose the message).

In a PQC context, a state could mandate that, say, a VPN connection between government sites be established using two tunnels: one secured with a globally recognized PQC algorithm and a second secured with a nation’s own algorithm. If one scheme fails (or is secretly penetrated), the other layer still protects the data. The Netherlands’ guidelines mention “protocol stacking” as a possible approach – e.g., running an inner protocol that provides quantum-safe encryption nested inside an outer protocol that also uses standard encryption. This is essentially a form of double encryption.
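Here is a minimal sketch of the nesting, using two AEAD ciphers with independent keys via the pyca/cryptography package. In a sovereignty scenario, the two keys would come from independent key-establishment mechanisms (say, a NIST KEM and a national KEM), so a break of one scheme still leaves one intact layer.

```python
# pip install cryptography
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM, ChaCha20Poly1305

# Two INDEPENDENT keys – in practice derived from two independent
# key-establishment mechanisms (e.g., a NIST KEM and a national KEM).
inner_key = AESGCM.generate_key(256)
outer_key = ChaCha20Poly1305.generate_key()

def layered_encrypt(plaintext: bytes) -> tuple[bytes, bytes, bytes]:
    n1, n2 = os.urandom(12), os.urandom(12)
    inner_ct = AESGCM(inner_key).encrypt(n1, plaintext, None)           # layer 1
    outer_ct = ChaCha20Poly1305(outer_key).encrypt(n2, inner_ct, None)  # layer 2
    return n1, n2, outer_ct

def layered_decrypt(n1: bytes, n2: bytes, outer_ct: bytes) -> bytes:
    inner_ct = ChaCha20Poly1305(outer_key).decrypt(n2, outer_ct, None)
    return AESGCM(inner_key).decrypt(n1, inner_ct, None)

n1, n2, ct = layered_encrypt(b"doubly protected payload")
print(layered_decrypt(n1, n2, ct))
```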

Another emerging idea is to combine different post-quantum approaches, like using both PQC and QKD together (one to distribute keys, the other to encrypt data), which the EU has floated for future-proof security. While QKD requires special hardware and is limited in scope, a purely software-based approach to layering could be two PQC KEMs used in tandem (one after the other) to derive keys. The cost is increased latency and bandwidth usage (since PQC keys and ciphertexts are large), but for critical applications, nations may accept that cost.

Layering also provides a way for a country to insert a classified encryption layer on top of public standards. For example, a country could have a secret algorithm only used by its intelligence agency, which wraps around the communication already encrypted by a public PQC algorithm. Outsiders would see the outer layer (maybe standard) but not be able to penetrate the inner layer without the secret keys.

Overall, multi-layer encryption is the ultimate fail-safe approach when trust is low – it’s literally defense in depth at the cryptographic level.


Each of these techniques – independent algorithms, hybrids, custom tuning, and multi-layer use – represents a way to reconcile national security interests with the global nature of technology. They are not mutually exclusive; a country could pursue all of them in different domains. The trend reflects a broader truth: encryption is not purely a technical matter, but a strategic one. Governments will choose options that ensure they are not helpless if a foreign-controlled standard is later found vulnerable or if geopolitical circumstances restrict access to certain technologies.

A Fractured Global Crypto Landscape

The consequence of these sovereignty-driven approaches is that the world is poised to enter the post-quantum age with multiple coexisting standards for encryption – where you stand may determine which algorithms you use. It amounts to an “east-west split” in quantum security standards. Broadly speaking, the U.S. and its allies (the “West”) are coalescing around the NIST PQC algorithms, while China, Russia, and some others (“East”) are exploring alternative or additional algorithms that may not interoperate easily with the Western standards.

Concrete examples of this divide are already visible. China, as noted, is trialing algorithms like LAC and Aigis that were developed domestically, and with the ICCS open call, it will likely add even more algorithms to that list. Russia is developing systems around Shipovnik (rosehip) signatures and other native schemes. South Korea is officially backing SMAUG, NTRU+, HAETAE, AIMer as its toolkit.

On the other hand, the U.S., UK, EU (for now), Canada, Japan, Australia, etc., are primarily standardizing CRYSTALS-Kyber, Dilithium, Falcon, HQC (a recent addition by NIST as a backup KEM) and so on. Germany and the Netherlands’ adoption of NIST choices in 2024-25 was a milestone showing that at least among NATO and EU partners there’s alignment. For now.

The result is likely a familiar scenario: analogous to how we had multiple encoding or networking standards in the past, we’ll have a patchwork of cryptography standards. Technical merit alone won’t erase divergence. Even if one algorithm is superior, nations might stick to their own for independence. We can foresee that devices and software in China will use Chinese PQC by default (perhaps with an option for international algorithms for external compatibility), Russian systems will use Russian algorithms, and Western systems will use NIST algorithms – and each will view the others’ cryptography with a degree of mistrust. The divergence might also fall along industry lines: for example, global financial networks could end up needing to accommodate both Western and Eastern cryptographic suites, given banks in, say, Europe and the U.S. will use one set, while banks in China or Russia use another set.

From a geopolitical standpoint, encryption fragmentation reflects deeper strategic rifts. Trust in foreign tech is low, and encryption is intimately tied to national security (after all, whoever controls the cryptography can potentially access or deny access to critical information). Unfortunately, this divide means interoperability challenges. If a Chinese company and a European company want to communicate securely, they might have to negotiate which cryptographic standard to use – or resort to a cumbersome solution of translating between algorithms via a trusted gateway. We could see something like “dual-algorithm” protocols emerging: e.g., a secure email that carries both a Western and an Eastern cipher version of the message to ensure the recipient can read it. Some global standards bodies (like the ISO/IEC JTC 1) might try to standardize a larger set of PQC algorithms including both NIST and non-NIST options to facilitate a more unified ecosystem, but whether consensus can be reached is uncertain.

For businesses and organizations, the multi-algorithm future is almost a given. A company with operations in both the EU/US and China will face compliance requirements to use different encryption in each locale, meaning their products and infrastructure must be agile. Already experts are urging planning for such scenarios, noting that companies with footprints in both spheres need to be ready to support more than one PQC standard simultaneously. This is a significant shift from the last few decades where, despite some outliers, a handful of algorithms (like RSA, ECC, AES) were globally dominant.

The fragmentation also introduces an interesting security wildcard: having diverse algorithms globally might actually reduce systemic risk (since a break in one algorithm wouldn’t expose everybody, only those who use it), but it increases complexity and may create new security holes in the seams between systems. Attackers could potentially target the weakest link or the translation layers between different crypto regimes.

Challenges and Opportunities

Moving into a post-quantum world with divergent cryptographic standards will present challenges on multiple levels, but it may also spur innovation and collaboration in unexpected ways. Here are some key points to consider for governments, companies, and the tech community:

Interoperability and Integration

One of the most immediate challenges is making systems with different cryptographic choices communicate. Global companies might need to implement multiple cryptographic suites in their products. For instance, a messaging app might include support for NIST PQC algorithms as well as Chinese algorithms, toggling depending on the user’s region or negotiation during handshake.
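Such per-region negotiation can be as simple as intersecting preference lists. A sketch with illustrative (not real) suite identifiers:

```python
# Illustrative suite identifiers – not real protocol codepoints.
def negotiate(client_prefs: list[str], server_allowed: set[str]) -> str | None:
    """Pick the client's most-preferred suite the server also permits."""
    for suite in client_prefs:
        if suite in server_allowed:
            return suite
    return None  # no common suite: fail, or fall back to a gateway

eu_client = ["ML-KEM-768+X25519", "FrodoKEM-976", "ML-KEM-1024"]
cn_server = {"CN-PQC-KEM-1", "ML-KEM-768+X25519"}  # hypothetical Chinese profile

print(negotiate(eu_client, cn_server))              # -> "ML-KEM-768+X25519"
print(negotiate(["CN-PQC-KEM-1"], {"ML-KEM-1024"})) # -> None
```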

Supporting many algorithms means heavier and more complex software (more code, larger libraries or hardware IP cores), which can introduce compatibility bugs or security flaws. Integration points like VPN gateways or secure email hubs might need special “translation” capabilities – essentially decrypting with one standard and re-encrypting with another in a controlled, secure environment. This is somewhat analogous to language translation, and it raises its own security issues (the data is decrypted at the gateway). If not handled carefully, those gateways could become targets or weak points.
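Conceptually, such a gateway is just “decrypt under regime A, re-encrypt under regime B” inside a hardened environment. In this toy sketch both regimes are AES-GCM purely for brevity (real regimes would be distinct national suites); the momentary plaintext inside the gateway is exactly the weak point noted above.

```python
# pip install cryptography
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

west_key, east_key = AESGCM.generate_key(256), AESGCM.generate_key(256)

def gateway_translate(nonce_in: bytes, ct_in: bytes) -> tuple[bytes, bytes]:
    """Re-encrypt a message from the 'western' regime to the 'eastern' one.
    The plaintext exists briefly inside the gateway – the trust-critical step."""
    plaintext = AESGCM(west_key).decrypt(nonce_in, ct_in, None)
    nonce_out = os.urandom(12)
    return nonce_out, AESGCM(east_key).encrypt(nonce_out, plaintext, None)

n = os.urandom(12)
ct = AESGCM(west_key).encrypt(n, b"cross-bloc payload", None)
n2, ct2 = gateway_translate(n, ct)
print(AESGCM(east_key).decrypt(n2, ct2, None))
```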

Furthermore, different standards might have different certificate formats or key lengths; an X.509 certificate carrying a Dilithium public key is very different from one carrying a Chinese lattice public key. Bridging these in a trust chain is non-trivial and may require new standards for composite/hybrid certificates as mentioned earlier.

Increased Complexity and Costs

Embracing crypto-agility to this extent (supporting multiple PQC options) comes at a cost. There is an engineering burden: developers and security teams must be familiar with many algorithms and their proper implementation. Testing and validation efforts multiply since each algorithm needs its own test vectors and security evaluation. There may also be performance costs – if a system has to, for example, perform two kinds of handshakes or include longer keys to satisfy the highest common denominator, it could slow things down. Devices with constrained resources (IoT sensors, smart cards, etc.) will struggle to accommodate bloated cryptographic libraries. This could delay the deployment of PQC in such devices if they need to carry extra baggage for sovereignty reasons. Financially, companies might need to pay for multiple sets of hardware IP or certifications (e.g., getting a FIPS validation for NIST algorithms and a separate certification for another country’s algorithms).

On the flip side, companies that become proficient in multi-standard support could find a competitive advantage, as they can serve more markets.

Compliance and Regulatory Uncertainty

The sovereign crypto trend means regulatory requirements may change with geopolitical shifts. A multinational bank might find that to operate in the EU they must use algorithms X and Y, but to also do business in China they must use Z – and these mandates could arise with relatively short deadlines if, say, a security scare hastens a switch. Keeping up with these mandates will be a challenge for compliance departments.

We might also see scenarios where international bodies try to broker alignment – for example, perhaps in some sectors there could be a bilateral agreement to recognize each other’s algorithms to ease trade. However, given the trust deficit, that seems distant currently. It’s more likely that organizations will have to maintain parallel cryptographic infrastructures for different regions, and ensure data is encrypted appropriately depending on where it flows.

This could even affect cloud computing: cloud providers might have to offer separate encryption options in different regional data centers.

Security Positives

A diverse algorithm environment isn’t all bad news. It can provide resilience. If a flaw is found in one algorithm (say a new mathematical attack on lattices), not all systems worldwide will be broken – those using other algorithms remain secure. This heterogeneity can act as a hedge against the unknown. It also pressures each algorithm designer community to up their game, knowing alternatives exist.

Moreover, the work done by one country’s researchers can benefit others. For instance, if Chinese or Russian cryptographers propose strong algorithms, those might eventually be adopted internationally. NIST’s Moody even said NIST would monitor China’s efforts and wouldn’t rule out incorporating a strong Chinese-developed algorithm if it offers a real improvement. In fact, NIST has already gone beyond its first picks: in 2025 it announced HQC, a French-led code-based KEM, as an additional standard, showing that the bench of viable algorithms is growing.

Perhaps in a decade, the “second generation” of PQC standards globally will include contributions from multiple nations’ competitions, converging the best of each. In that optimistic scenario, sovereignty efforts now could broaden the portfolio of well-vetted algorithms that everyone can choose from.

New Collaboration and Alliances

Cryptography might also become an area of international collaboration among like-minded states. We might see, for example, the European Union and allies like Japan or Canada working together on research for alternatives to NIST’s picks (indeed, many such researchers were already on the NIST submissions). Likewise, China and Russia could share results of their separate efforts to strengthen their hand collectively against U.S. standards.

This could lead to regional blocs of cryptographic standards – perhaps an “EU PQC” that is used across Europe and maybe adopted by some neighboring countries in Africa or the Middle East that align with the EU’s regulatory orbit. Similarly, a “BRICS PQC” set might emerge. This is speculative, but it follows the pattern of technology spheres of influence. Companies and products might then advertise multi-standard compliance as a feature: e.g., a secure communication device that supports both EU-approved and China-approved algorithms, configurable per deployment.

Long-Term Evolution

In the long run, if diplomatic relations improve or if one set of algorithms proves indisputably superior, there could be a convergence. Remember that in classical crypto, despite early divergence (e.g., the USSR’s GOST ciphers vs. the West’s DES and later AES), AES eventually became an ISO standard used worldwide, and Russia’s GOST algorithms were likewise put forward for international standardization. The underlying math doesn’t know political boundaries, so it’s possible that after the initial jostling, a core global suite will re-emerge. But that may take many years, and in the meantime, planning for divergence is prudent.

Conclusion

The advent of powerful quantum computers poses a threat that demands a cryptographic response; on that, the world agrees. But which response to deploy is where consensus frays, revealing underlying geopolitical fault lines. The transition to post-quantum cryptography is not only a technical upgrade but also a test of global cooperation and trust. As we’ve explored, the landscape is fragmenting: NIST’s PQC standards are a critical foundation and will likely secure a large portion of digital infrastructure worldwide, yet they are not going unchallenged or unmodified in many regions.

For a broad policy and technology audience, the key takeaway is that planning for a quantum-safe future must account for this fragmented reality. Organizations should not assume a single universal standard will prevail everywhere. Instead, they should build crypto-agility into their systems – the ability to switch algorithms or run multiple in parallel – because interoperability will be an active concern.

International cooperation (through bodies like ISO, ITU, or bilateral agreements) may soften the edges of divergence and prevent completely isolated “crypto islands,” but given the strategic importance of encryption, nations will likely maintain some proprietary stance. We may see an uneasy equilibrium where a handful of different PQC standards dominate different spheres of influence.
