Cryptographically Relevant Quantum Computers (CRQCs)
Introduction
Definition of CRQC
A Cryptographically Relevant Quantum Computer (CRQC) is a quantum computing system powerful enough to break modern cryptographic algorithms that secure digital communications. In practical terms, a CRQC is a large-scale, fault-tolerant quantum computer capable of running quantum algorithms (like Shor’s algorithm) to crack the cryptographic schemes (e.g., RSA or ECC) that underlie today’s security. It contrasts with the small, noisy quantum processors we have now in that it can reliably perform long computations needed to attack encryption.
CRQC vs Early-stage (NISQ) Devices
Today’s quantum machines are in the Noisy Intermediate-Scale Quantum (NISQ) era. NISQ devices contain tens to a few hundred qubits and suffer high error rates, preventing sustained calculations. They are not yet fault-tolerant – meaning they cannot correct errors on the fly – so their algorithms must be very short before noise overwhelms the result. A CRQC, by contrast, would have thousands (or more) of effective, error-corrected qubits and low error rates, allowing it to run complex algorithms far beyond NISQ capabilities. Essentially, NISQ machines might demonstrate “quantum supremacy” on niche problems, but they cannot break RSA or other cryptography; a CRQC would be able to, because it operates beyond the NISQ regime.
Why CRQCs Matter for Cybersecurity
Modern cybersecurity fundamentally relies on cryptography – protocols like TLS, VPNs, digital signatures, and secure storage all assume certain mathematical problems are intractable for classical computers. A CRQC would undermine those assumptions by solving those hard problems quickly. For example, factoring a 2048-bit RSA modulus is practically impossible for classical computers (estimated to take longer than the age of the universe), but a sufficiently powerful quantum computer could do it in a feasible time. That means all data protected by RSA or elliptic-curve cryptography (ECC) could be decrypted, and digital signatures forged. In short, a CRQC would shatter the security of most currently deployed cryptographic systems. This looming threat has huge implications: adversaries could retroactively decrypt sensitive communications (the “harvest now, decrypt later” strategy), impersonate identities by forging signatures, and undermine the trust in digital systems. Cybersecurity professionals must understand CRQCs to anticipate these risks and guide the transition to quantum-resistant solutions.
Theoretical Foundations
Quantum Computing Principles
Quantum computers leverage principles of quantum mechanics to process information in ways impossible for classical computers. The basic unit is the qubit (quantum bit), which, unlike a classical bit (0 or 1), can exist in a superposition of 0 and 1. Mathematically, a qubit’s state is described by a vector in a two-dimensional complex Hilbert space: $|\psi\rangle = \alpha|0\rangle + \beta|1\rangle$, where $\alpha$ and $\beta$ are complex amplitudes with $|\alpha|^2+|\beta|^2=1$. This means a qubit can represent 0 and 1 simultaneously, in a weighted combination. When measuring the qubit, we get 0 with probability $|\alpha|^2$ or 1 with probability $|\beta|^2$, but before measurement the state encompasses both possibilities.
Superposition allows an $n$-qubit register to occupy a weighted combination of all $2^n$ basis states at once. Another key principle is entanglement, a correlation between qubits such that the state of one qubit depends on the state of another, no matter the distance between them. An example is the Bell state of two qubits: $|\Phi^+\rangle = \frac{1}{\sqrt{2}}(|00\rangle + |11\rangle)$, where measuring one qubit instantly determines the state of the other. Entanglement is essential for quantum algorithms, enabling coordinated operations that have no classical analog. Quantum algorithms manipulate qubits with quantum gates (unitary transformations) that exploit superposition and entanglement to perform computations across many states simultaneously, arranging for the amplitudes of correct results to interfere constructively and those of incorrect results to interfere destructively.
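To make the formalism concrete, here is a minimal NumPy sketch (illustrative only) of the Born rule for a single qubit and the correlations of a Bell state; the state vectors and probabilities follow directly from the definitions above.

```python
import numpy as np

# Single qubit in superposition: |psi> = alpha|0> + beta|1>
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)  # any amplitudes with |a|^2 + |b|^2 = 1
psi = np.array([alpha, beta])
probs = np.abs(psi) ** 2                       # Born rule: P(0)=|alpha|^2, P(1)=|beta|^2
print(probs)                                   # [0.5 0.5]

# Two-qubit Bell state |Phi+> = (|00> + |11>)/sqrt(2); basis order |00>,|01>,|10>,|11>
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
print(np.abs(bell) ** 2)                       # [0.5 0.  0.  0.5] -- outcomes perfectly correlated
```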
Quantum Speedups in Cryptography
Quantum algorithms can dramatically reduce the computational complexity of certain problems that are hard for classical computers. In complexity-theoretic terms, many cryptographic problems lie outside the class of efficiently solvable problems on a classical computer (not in P or not known to be in P), yet fall into BQP (bounded-error quantum polynomial time). The most salient examples for cryptography are:
- Shor’s Algorithm (1994): Peter Shor discovered that a quantum computer can factor large integers in polynomial time. Specifically, factoring an $n$-bit number using Shor’s algorithm has time complexity roughly $O(n^3)$, or better with subsequent optimizations, versus the best-known classical factoring algorithm, the General Number Field Sieve, which is sub-exponential, running in about $\exp\left(O\left(n^{1/3}(\log n)^{2/3}\right)\right)$ steps. Shor’s algorithm uses the quantum Fourier transform to solve the period-finding problem that underlies both integer factorization and discrete logarithms: it efficiently finds the period of the function $f(x) = a^x \bmod N$, and that period yields the factors of $N$ (a toy sketch of this classical post-processing step appears after this list). By solving the integer factorization problem, Shor’s algorithm directly breaks RSA encryption, whose security rests on the difficulty of factoring large semiprime numbers. Likewise, Shor’s method solves the discrete logarithm problem in finite fields and on elliptic curves, which means it can break Diffie–Hellman key exchange, DSA/ECDSA digital signatures, and elliptic-curve cryptography as well. The impact is severe: RSA, Diffie–Hellman, and ECC – widely used in SSL/TLS, VPNs, SSH, cryptocurrencies, etc. – would no longer be secure once a CRQC can run Shor’s algorithm on keys of relevant sizes.
- Grover’s Algorithm (1996): Lov Grover developed a quantum search algorithm that provides a quadratic speedup for unstructured search problems. If a classical brute-force attack on a cryptographic key requires trying $N$ possibilities (operations on the order of $N$), Grover’s algorithm can find the solution in roughly $\mathcal{O}(\sqrt{N})$ steps. This does not sound as dramatic as Shor’s exponential speedup, but it has important implications for symmetric cryptography. For example, a 128-bit symmetric key (like in AES-128) has $2^{128}$ possible values; a classical attack needs up to $2^{128}$ trials, which is infeasible, whereas a quantum attacker running Grover’s algorithm could recover the key in about $2^{64}$ steps. In effect, Grover’s algorithm halves the effective key length of symmetric ciphers. AES-128 would offer only ~64-bit security against a CRQC, which is not considered secure, while AES-256 would be reduced to ~128-bit security (still generally safe). Similarly, Grover’s algorithm can speed up attacks on hash functions (finding preimages or certain collisions) from $2^n$ to $2^{n/2}$ operations. Crucially, Grover’s is a general attack algorithm: it works against any brute-force scenario (password cracking, key search). It is also provably optimal for unstructured quantum search, so no generic quantum attack can beat this quadratic speedup. Thus, symmetric algorithms are weakened but not totally broken by quantum computing: doubling key sizes can compensate for Grover’s algorithm in theory.
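To see why period-finding yields factors, here is a toy Python sketch of Shor's classical post-processing. The quantum subroutine (finding the order $r$ of $a$ modulo $N$) is replaced by a brute-force loop, which only works for tiny $N$; the rest (computing $\gcd(a^{r/2} \pm 1, N)$) is exactly the classical step the algorithm relies on.

```python
from math import gcd
from random import randrange

def find_period_classically(a, N):
    """Stand-in for the quantum subroutine: find the order r with a^r = 1 (mod N)."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_factor(N):
    """Classical post-processing of Shor's algorithm on a toy modulus N."""
    while True:
        a = randrange(2, N)
        g = gcd(a, N)
        if g > 1:
            return g, N // g          # lucky guess already shares a factor with N
        r = find_period_classically(a, N)
        if r % 2 == 1:
            continue                  # need an even period; retry with another a
        y = pow(a, r // 2, N)
        if y == N - 1:
            continue                  # a^(r/2) = -1 (mod N) gives only trivial factors
        p = gcd(y - 1, N)
        if 1 < p < N:
            return p, N // p

print(shor_factor(15))                # e.g. (3, 5) -- period found, factors recovered
```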
Mathematically, Grover’s algorithm can be understood as iterative amplitude amplification. It starts by preparing a superposition of all $N$ possible inputs, then uses an oracle function that marks the correct solution, followed by a diffusion (inversion-about-average) operation to amplify the probability amplitude of marked states. After about $\frac{\pi}{4}\sqrt{N}$ iterations, measuring the quantum state yields the solution with high probability. The quadratic speedup stems from the marked amplitude growing roughly linearly with the number of iterations (so the success probability grows quadratically), whereas a classical search increases its success probability only linearly in the number of attempts.
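The amplitude-amplification loop is easy to simulate classically for small $N$; this sketch applies the oracle sign flip and the inversion-about-average step directly to the state vector.

```python
import numpy as np

n_qubits = 10
N = 2 ** n_qubits                     # size of the search space
marked = 123                          # index of the unknown "solution"

amps = np.full(N, 1 / np.sqrt(N))     # uniform superposition over all N inputs
iterations = int(np.pi / 4 * np.sqrt(N))

for _ in range(iterations):
    amps[marked] *= -1                # oracle: flip the sign of the marked state
    amps = 2 * amps.mean() - amps     # diffusion: inversion about the average

print(iterations, amps[marked] ** 2)  # 25 iterations, success probability ~0.999
```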
Impact on RSA/ECC vs Symmetric Crypto
In summary, Shor’s algorithm offers exponential speedup on problems that underlie most public-key cryptosystems (factoring and discrete log), rendering them insecure when a CRQC is available. Grover’s algorithm offers a polynomial (quadratic) speedup on tasks relevant to symmetric cryptography and hashing, effectively reducing security levels but not completely compromising them if measures (like larger keys) are taken. For context, RSA-2048 (considered 112-bit classical security) would be trivial for a sufficiently large quantum computer (since Shor scales polynomially), whereas AES-128 (128-bit security against classical brute force) would still require on the order of $2^{64}$ quantum operations, which is heavy but potentially reachable in the future. This difference is why the advent of CRQCs primarily threatens public-key cryptography and why we focus on replacing those algorithms, while symmetric cryptography needs relatively minor adjustments (like moving to AES-256, SHA-384/512, etc.) rather than outright replacement.
Quantum Computational Power and Cryptanalysis
Achieving Quantum Speedup Beyond NISQ
Current NISQ devices cannot execute the long, complex circuits required for cryptanalytic algorithms on meaningful key sizes. A CRQC, however, would implement fault-tolerant quantum computing, using quantum error correction to stabilize computations over millions or billions of quantum gate operations. The quantum speedups described (Shor, Grover) only manifest when the algorithm can run on a large input (large number to factor, large key space to search). For RSA-2048, for instance, Shor’s algorithm might require on the order of $10^{12}$ quantum gate operations to find the factors. NISQ machines decohere long before such a sequence can complete. A CRQC achieves the necessary speedup by combining scale and stability: it will have a high qubit count, high-fidelity gates, and an error correction scheme to sustain coherent operation for hours if needed. In practice, this means the machine could run a circuit with, say, $10^{12}$ operations by encoding logical qubits into many physical qubits and continuously correcting errors. This ability to run deep circuits without being ruined by noise is what separates a cryptographically relevant quantum computer from today’s prototypes.
Qubit Counts to Break RSA/ECC
How powerful does a quantum computer need to be to break common cryptosystems? Research has produced resource estimates:
- RSA-2048: Estimates have varied, but recent optimizations greatly reduced the qubit requirements. One 2024 study estimates that approximately 1730 logical qubits (error-corrected qubits) and on the order of $2^{36}$ quantum gate operations could factor a 2048-bit RSA modulus in a single run; the algorithm might need to be run about 40 times on average to succeed. In terms of physical qubits, the overhead for error correction is massive. Using the popular surface code for error correction (with a code distance around 27 at 99.9% gate fidelity), each logical qubit could require on the order of $10^3$ or more physical qubits. Thus, factoring RSA-2048 may require on the order of a few million physical qubits with reasonable error rates. An earlier estimate by Gidney and Ekerå (2019) put the figure at about 20 million physical qubits to factor 2048-bit RSA in about 8 hours. These numbers are far beyond the scale of current devices (which have at most around 1000 physical qubits), highlighting the gap between NISQ and a CRQC.
- Elliptic Curve (ECC-256): Breaking a 256-bit elliptic curve cryptography key (e.g., the curve used in ECDH/ECDSA for 128-bit security) also requires running Shor’s discrete log algorithm on the elliptic curve group. Interestingly, some analyses indicate this might be slightly easier than RSA-2048 in terms of qubits. For example, Häner et al. (2020) estimated roughly $8n + 10.2\log n$ logical qubits for an n-bit elliptic curve discrete log problem. For n = 256, this comes out to on the order of ~2100 logical qubits (which is of the same order as RSA-2048 factoring). In general, breaking commonly used ECC (secp256r1, etc.) would likely require a couple thousand logical qubits and a similar scale of gates as RSA. Thus, RSA-2048 and ECC-256 have roughly comparable difficulty for a quantum computer (both are believed to require a few thousand stable qubits and billions of operations), and both are well out of reach of current machines.
For context, in 2024 the industry is only just reaching the 1000-physical-qubit mark, and those qubits are noisy (not error-corrected). Reaching the thousands of logical qubits needed for a CRQC will likely require hardware on the order of millions of physical qubits given error-correction overheads – a huge leap from the state of the art.
Error Correction and the Road to Fault Tolerance
The main technical challenge in building a CRQC is quantum error correction (QEC). Physical qubits are highly error-prone (gate error rates of roughly $10^{-3}$ to $10^{-4}$ per operation are common). To perform a cryptographically relevant computation, error rates must be suppressed to extremely low levels over the entire algorithm. QEC involves encoding a single logical qubit into many physical qubits such that if some physical qubits get corrupted, the logical information can be recovered. For example, the surface code – a leading QEC code – requires on the order of $d^2$ physical qubits per logical qubit, where $d$ is the code distance. To suppress logical error rates far enough for cryptanalytic circuits, a distance of roughly $d \approx 25$–$30$ might be needed, meaning around 625–900 physical qubits per logical qubit (and even more when including the ancillas used for syndrome measurement). The threshold theorem in quantum computing tells us that if physical gate error rates are below a certain threshold (around 1% for many codes), arbitrarily long computations become possible by adding more redundancy.
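As a back-of-the-envelope illustration, the sketch below sizes a surface code using the commonly cited heuristic $p_L \approx 0.1\,(p/p_{th})^{(d+1)/2}$ for the logical error rate. The constant 0.1, the 1% threshold, and the factor-of-2 ancilla overhead are all rough assumptions for illustration, not exact figures.

```python
def surface_code_estimate(p_phys, p_target, p_threshold=1e-2):
    """Rough surface-code sizing, assuming the standard heuristic
    p_logical ~ 0.1 * (p_phys / p_threshold) ** ((d + 1) / 2)."""
    d = 3
    while 0.1 * (p_phys / p_threshold) ** ((d + 1) / 2) > p_target:
        d += 2                            # code distance is odd
    physical_per_logical = 2 * d ** 2     # ~d^2 data qubits plus ~d^2 ancillas
    return d, physical_per_logical

# Suppress a 10^-3 physical error rate to 10^-12 per logical operation,
# roughly what a ~10^12-gate factoring circuit would demand.
d, overhead = surface_code_estimate(1e-3, 1e-12)
print(d, overhead)                        # d = 21, ~880 physical qubits per logical qubit
print(2000 * overhead)                    # ~1.8 million physical qubits for ~2000 logical qubits
```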
Transitioning from NISQ to fault tolerance requires a few major milestones: improving gate fidelities (to reduce physical error rates), scaling up the number of qubits, and demonstrating QEC that actually extends coherence (so far, small logical qubits have been demonstrated and maintained only for short periods). As error rates drop and qubit counts rise into the thousands, we may achieve a kind of early fault-tolerant quantum computer that can run medium-scale algorithms reliably. From there, further scale-up and optimization are needed to reach CRQC-level algorithms such as factoring 2048-bit numbers.
Timeline for CRQC Realization
Predicting when a true CRQC will appear is difficult, as it depends on both engineering progress and scientific breakthroughs. However, informed estimates exist. Surveys of experts suggest that the chances of a CRQC arriving increase over the next two decades. A recent expert survey in 2023 found that while most experts see the probability as very low in the next 5 years, by 15–20 years out a majority believe there’s a significant chance a CRQC will exist. By 30 years out, nearly all experts surveyed believe that quantum attacks will be practical. In concrete terms, some forecasts (e.g., Gartner) predict that by around 2030 quantum computers will start to pose a significant threat to asymmetric cryptography, and by 2033–2035 they could potentially break public-key crypto completely. This is my personal prediction as well.
These timelines are speculative, but they underscore an urgent point: a CRQC might be built within the next two decades, and perhaps even sooner if unforeseen breakthroughs accelerate progress. Governments are acting on the assumption that we might only have a decade (or less) to transition to quantum-safe cryptography, given the high stakes. From a cybersecurity planning perspective, the prudent approach is to expect that by the 2030s a cryptographically relevant quantum computer could indeed emerge, and to prepare accordingly – especially since data encrypted today (and stolen via “store now, decrypt later” tactics) could be compromised once that happens.
Cryptographic Implications
A CRQC would undermine the security of many cryptographic protocols in use today. We break down the vulnerable areas and how the industry is responding with quantum-resistant alternatives:
Vulnerable Cryptographic Protocols
Virtually all widely used public-key (asymmetric) cryptographic systems rely on math problems that Shor’s algorithm can solve efficiently. Key examples include:
- Public-Key Encryption and Key Exchange: Systems like RSA encryption, RSA-OAEP, and RSA-based key transport, as well as Diffie–Hellman (DH) and elliptic curve Diffie–Hellman (ECDH) key exchange, would be directly broken. Shor’s algorithm can factor the RSA modulus to recover private keys, and solve the discrete log problem to recover DH/ECDH private keys. This means an eavesdropper with a CRQC could decrypt secure communications by deriving the session keys (for RSA, by decrypting the key exchange; for DH/ECDH, by solving for the shared secret). Protocols like TLS, IPsec, and SSH, which commonly use these algorithms to establish secret keys, would no longer be secure against interception if the attacker has a CRQC. Even data encrypted in the past (and recorded) would be at risk – e.g., a recorded TLS handshake using ECDHE could have its ephemeral secret discovered, thereby decrypting the session.
- Digital Signatures and PKI: Digital signature schemes based on RSA or ECC (e.g., RSA signatures, ECDSA, DSA) would also fail. An adversary could forge signatures by computing the secret signing key from the public key (factor the RSA modulus to get the private exponent, or do discrete log on the EC public key to get the signing key). This breaks the authenticity guarantees of our public key infrastructure: an attacker could impersonate websites (by forging TLS certificates), sign malware as if it were from a trusted vendor (code-signing certificates), or falsify documents and messages that rely on digital signatures for integrity. The supply chain security implications are especially dire – e.g., an attacker could forge a Windows Update package signature if they cracked Microsoft’s code-signing key, enabling broad compromise. Essentially, the trust model of the Internet and software distribution would collapse unless new signature schemes secure against quantum attacks are in place.
- Secure Email and PGP, etc.: Protocols like PGP/GPG or S/MIME use RSA or ECC for key management and signatures; these would likewise be broken, compromising the confidentiality and authenticity of secure email.
- Cryptographic Authentication: Mechanisms such as challenge-response protocols using RSA/ECC signatures, or password-authenticated key exchanges that rely on a Diffie–Hellman exchange, would be susceptible. Even some cryptocurrency systems (many cryptocurrencies use ECDSA for transaction signatures and ECDH for certain key exchanges) would be vulnerable – for instance, an attacker with a CRQC could potentially forge transactions if they obtain the public keys (addresses) of wallets, by deriving the private key via discrete log.
In contrast, most symmetric cryptographic primitives (encryption algorithms like AES, 3DES, and hash functions like SHA-256) are not outright broken by known quantum algorithms, but their security strength is reduced by Grover’s algorithm. For symmetric ciphers, as noted, a brute-force key search is quadratically faster. For hash functions, finding a preimage can be sped up quadratically (though finding collisions may see less improvement; the best known quantum attack finds collisions in $O(2^{n/3})$, a slight improvement over classical $O(2^{n/2})$ for collision search). The practical effect is that symmetric keys should be doubled in length to maintain the same security level against quantum attack. For example, 256-bit keys for symmetric encryption and 512-bit output for hash functions are recommended to retain an adequate security margin in the post-quantum era.
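The arithmetic behind these reductions is simple enough to tabulate. The sketch below is illustrative only, ignoring the large constant factors and memory requirements that make Grover and the BHT collision algorithm far less practical than the bare exponents suggest.

```python
# Effective security (in bits) of common symmetric primitives against the best
# known quantum attacks: Grover for key search / preimages, BHT for collisions.
primitives = {
    "AES-128 key search": 128,
    "AES-256 key search": 256,
    "SHA-256 preimage":   256,
    "SHA-256 collision":  256,   # n = hash output length
}

for name, n in primitives.items():
    if "collision" in name:
        classical, quantum = n // 2, n // 3   # birthday bound vs. BHT's 2^(n/3)
    else:
        classical, quantum = n, n // 2        # exhaustive search vs. Grover's 2^(n/2)
    print(f"{name}: {classical}-bit classical -> ~{quantum}-bit quantum")
```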
To summarize, traditional public-key cryptography is the Achilles’ heel – RSA, DH, ECC (including the widely used curves like secp256r1 and Curve25519) will no longer be secure once a CRQC exists. Symmetric crypto and hashing will need tweaks (larger keys and outputs), but not a full replacement.
Post-Quantum Cryptography (PQC) – Quantum-Resistant Algorithms
In anticipation of the CRQC threat, researchers have been developing new cryptographic algorithms believed to be secure against quantum attacks. Collectively known as post-quantum cryptography, these are typically based on mathematical problems that (as far as we know) neither classical nor quantum computers can solve efficiently. Some major families of PQC include:
- Lattice-Based Cryptography: Based on problems like Learning With Errors (LWE) or the Shortest Vector Problem in high-dimensional lattices. Lattice problems are believed hard for quantum computers (no analog of Shor’s algorithm is known for them). Many of the leading PQC algorithms are lattice-based. For example, CRYSTALS-Kyber (a key encapsulation mechanism) and CRYSTALS-Dilithium (a digital signature scheme) are lattice-based and have been selected by NIST for standardization. These offer efficient performance with compact keys at security levels calibrated to symmetric equivalents (e.g., Kyber-768 roughly matches AES-192 security). The hardness comes from algebraic problems over polynomial rings or matrices that a quantum computer apparently cannot solve any faster than by brute force (a toy LWE encryption sketch appears after this list).
- Code-Based Cryptography: Based on error-correcting codes, e.g., the difficulty of decoding random linear codes (the syndrome decoding problem) or specific code constructs like Goppa codes. The classic example is McEliece encryption (using binary Goppa codes), which has remained unbroken since its introduction in 1978 and appears quantum-resistant. Code-based schemes tend to have very large public keys (hundreds of kilobytes) but are very fast in encryption/decryption. NIST’s process included a code-based finalist (Classic McEliece) for encryption.
- Multivariate Quadratic Cryptography: Based on the difficulty of solving large systems of nonlinear (quadratic) equations over finite fields. Schemes like Rainbow (a multivariate signature) showed promise, but many were broken by classical cryptanalysis during the PQC evaluation process. No multivariate scheme was selected in NIST’s first set of standards, due to these security concerns.
- Hash-Based Signatures: Relying solely on the security of hash functions (which we believe remain strong against quantum attack, apart from generic brute-force speedups). Hash-based signatures (like XMSS and SPHINCS+) use one-time signature techniques and Merkle trees of hashes. They produce relatively large signatures and have some state-management complexities (for stateful schemes like XMSS), but are very conservative in security (based on well-understood hash properties). NIST selected SPHINCS+ (a stateless hash-based signature scheme) for standardization alongside the lattice-based schemes.
- Isogeny-Based Cryptography: Based on problems related to elliptic curve isogenies (e.g., Supersingular Isogeny Diffie–Hellman, SIDH). These schemes had very small key sizes and were an active research area, but SIKE, an isogeny-based candidate in the NIST process, was broken in 2022 by an unexpected and efficient classical attack, illustrating the uncertainty in evaluating new mathematical problems. Nonetheless, some isogeny approaches (such as CSIDH) remain topics of study.
- Other approaches: e.g., NTRU (an older lattice-based encryption), FrodoKEM (lattice without structure, hence larger keys), and algebraic code-based hybrids, etc., have been studied. The PQC field underwent a competitive evaluation led by NIST, resulting in a set of recommended algorithms (Kyber, Dilithium, Falcon, SPHINCS+ as of 2022) moving toward standardization.
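To give a feel for why LWE supports encryption, here is a toy, deliberately insecure Regev-style scheme in Python. The dimensions and noise are chosen for readability, not security, and the modulus 3329 is borrowed from Kyber purely for flavor.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, q = 32, 64, 3329                      # toy dimensions; q is Kyber's modulus

# Key generation: public (A, b = A s + e mod q); secret s; e is small noise.
A = rng.integers(0, q, size=(m, n))
s = rng.integers(0, q, size=n)
e = rng.integers(-2, 3, size=m)             # small error terms in [-2, 2]
b = (A @ s + e) % q

def encrypt(bit):
    """Regev-style encryption of one bit: combine a random subset of LWE samples."""
    r = rng.integers(0, 2, size=m)          # random 0/1 selection vector
    u = (r @ A) % q
    v = (r @ b + bit * (q // 2)) % q
    return u, v

def decrypt(u, v):
    d = (v - u @ s) % q                     # = bit*(q/2) + small noise (mod q)
    return int(q // 4 < d < 3 * q // 4)     # closer to q/2 -> 1, closer to 0 -> 0

u, v = encrypt(1)
print(decrypt(u, v))                        # 1 -- the noise is small enough to round away
```

Recovering $s$ from $(A, b)$ requires solving LWE, which is the problem believed hard for both classical and quantum computers; real schemes like Kyber use structured (module) lattices for efficiency.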
These quantum-resistant algorithms are designed such that known quantum algorithms (like Shor or Grover) do not undermine them. For instance, Grover’s algorithm still applies generically (meaning a quadratic search speedup could attack any scheme by brute force), but if the key sizes or parameters are chosen appropriately (e.g., using 256-bit keys or equivalent security to withstand Grover’s $\sqrt{N}$ speedup), that generic attack is ineffective. There are no known super-polynomial quantum attacks for these PQC problems, though ongoing research continues to scrutinize them.
Migration Challenges for Enterprises and Governments
Adopting post-quantum cryptography across all systems is a massive challenge. Some key issues include:
- Standards and Interoperability: Until recently, there were no standardized PQC algorithms. NIST’s PQC competition has provided clarity on which algorithms to use, but those now need to be integrated into internet standards (TLS, IPsec, SSH, etc.), communication protocols, hardware security modules, smart cards, and so on. Ensuring different products and systems can talk to each other with the new algorithms requires coordinated updates to protocols and libraries (for example, adding PQC cipher suites in TLS and getting servers and browsers to support them).
- Performance and Resource Overheads: Many PQC algorithms have larger key sizes or signature sizes than their classical counterparts. For example, a Dilithium signature might be a few kilobytes (versus a few tens of bytes for ECDSA), and a Kyber public key is around 800 bytes (versus 32 bytes for an ECC public key). This can impact network bandwidth and storage (imagine TLS certificates bulking up, or the overhead in IoT devices with constrained memory). Verification and key generation might also be slower on limited hardware. Embedded devices, smart cards, and microcontrollers that could handle RSA/ECC might struggle with PQC algorithm requirements without hardware acceleration. Some environments (like RFID tags or very low-power IoT sensors) may find it challenging to implement PQC at all, necessitating innovative solutions or careful parameter choices.
- Legacy Systems and Software Update: Perhaps the biggest hurdle is the sheer number of systems that use cryptography. Enterprise networks, government systems, critical infrastructure, and consumer devices all rely on cryptographic libraries baked into software and hardware. Many of these systems are legacy or hard to update (e.g., SCADA systems, medical devices, older IoT gadgets, or even satellites in orbit). Replacing or upgrading the cryptography in these can be extremely slow and costly. Crypto-agility – designing systems to be able to swap out cryptographic components easily – was not widely practiced in the past. As a result, the migration to PQC could take many years or even decades. NIST has noted that introducing new cryptographic algorithms and getting them widely deployed “requires updates to protocols, schemes, and infrastructures that could take decades to complete”. This long transition period is worrisome, because it might intersect with the arrival of CRQCs, leaving a window where adversaries can exploit weaknesses.
- Trust and Maturity: The new PQC algorithms are relatively young. While they have undergone years of public scrutiny in the standardization process, they haven’t been field-tested for decades like RSA and ECC. Organizations may be cautious about deploying them until they are confident in their security and reliability. There’s also the risk that new classical or quantum attacks on these schemes are discovered (as happened with some alternate candidates during the NIST process). Thus, a phased or hybrid approach is often recommended – for instance, using hybrid encryption in TLS that combines a traditional algorithm (like ECDH) with a PQC algorithm (like Kyber) so that even if one breaks, the other still protects the session (a minimal sketch of the hybrid combiner idea appears after this list). Many early adopters (including some browsers and cloud providers) are testing such hybrid modes.
- Infrastructure and Ecosystem: Beyond just algorithms, certificates and PKI infrastructure need to adapt. Certificate authorities will need to start issuing PQC-based certificates. New root certificates using PQC algorithms might need to be distributed. Support for PQC in TLS can’t just be at the protocol level; it needs OS support, library support (OpenSSL, etc.), and so on. Hardware appliances (firewalls, VPN concentrators, etc.) that accelerate cryptography may need upgrades to handle PQC. All of this requires significant planning and investment.
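The combiner at the heart of hybrid modes can be sketched in a few lines. This version is illustrative: the function name and placeholder inputs are hypothetical, and real deployments (such as the X25519+Kyber768 TLS experiments) feed the concatenated secrets into the handshake's HKDF-based key schedule rather than a bare hash.

```python
import hashlib

def hybrid_shared_secret(ss_classical: bytes, ss_pqc: bytes, context: bytes) -> bytes:
    """Combine a classical (e.g. ECDH) and a PQC (e.g. Kyber) shared secret.

    The derived key stays safe as long as EITHER input secret remains unbroken,
    mirroring the concatenate-then-KDF approach used in hybrid TLS drafts.
    """
    return hashlib.sha256(ss_classical + ss_pqc + context).digest()

# Hypothetical placeholder secrets -- in practice these come from the two
# key-exchange runs performed inside the same handshake.
key = hybrid_shared_secret(b"\x01" * 32, b"\x02" * 32, b"hybrid-kex-demo")
print(key.hex())
```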
Given these challenges, governments and large enterprises are already urging action. The U.S. government, for example, has mandated agencies to begin inventorying their cryptographic systems and have a plan for transitioning to PQC within the next few years. The U.K.’s National Cyber Security Centre similarly has guidance urging organizations to prepare for migration and adopt crypto-agile practices. The key advice for now is don’t wait for the quantum computer to arrive; start the transition before CRQCs are here, because by the time one is announced, it may be too late to protect decades’ worth of sensitive data that’s already been encrypted with vulnerable algorithms.
Broader Cybersecurity Implications
Beyond directly breaking encryption algorithms, the advent of powerful quantum computers (and related quantum technologies) has other security implications that cybersecurity professionals should consider.
Quantum-Enhanced Attacks Beyond Cryptanalysis
Adversaries might leverage quantum computing in more indirect ways to compromise systems:
- Side-Channel Attacks with Quantum Tech: Side-channel attacks exploit information leaked by the physical implementation of a system (such as timing, power consumption, or electromagnetic emissions) to derive secret data like cryptographic keys. Quantum technology could supercharge these attacks. For instance, quantum sensors (devices exploiting quantum effects to achieve extreme sensitivity) can potentially measure electromagnetic fields or other signals with far greater precision than classical sensors. A research program in Germany is exploring how quantum magnetometers or other sensors could be used to sniff out cryptographic keys by detecting minute variations in a chip’s electromagnetic leakage – variations too subtle for conventional equipment. This could make side-channel attacks easier or more effective, even against devices previously considered safe. Similarly, quantum computers might help in analyzing side-channel data: e.g., using quantum algorithms to sift through massive traces of power usage data to find the needle-in-a-haystack correlation revealing a key.
- Accelerating Brute-Force and Password Cracking: While Grover’s algorithm impacts symmetric key search, a quantum computer could also be used to crack passwords or cryptographic keys via brute force in ways not directly covered by Grover (especially if some structure or heuristic can be exploited). If attackers obtain password hashes, a quantum computer might attempt guesses faster by effectively testing many possibilities in superposition. Passwords, however, are usually low-entropy, so even classical cracking often succeeds; quantum just tilts the scales further in favor of attackers. This means current password-based systems will be even more vulnerable unless augmented by quantum-resistant schemes or higher-entropy secrets.
- Algorithmic Attacks and Cryptanalysis: Quantum computers may aid in other forms of cryptanalysis or problem-solving relevant to security. For instance, some cryptographic reductions or puzzle-solving can be framed as optimization or SAT problems, which might be sped up by quantum algorithms like Grover or quantum annealing techniques. An example is analysis of certain block cipher structures or hash functions – while no dramatic quantum attacks are known beyond Grover’s generic approach, it’s an area to monitor. There’s also research into quantum algorithms for solving systems of linear equations (HHL algorithm) or machine learning. A quantum adversary might use these to break codes or CAPTCHA-like challenges, or to better model and exploit security systems.
In summary, quantum computers expand the toolkit for attackers beyond just breaking encryption – they can become powerful data analysis and optimization engines that could find vulnerabilities or secret information faster than classical means.
Quantum-Aided Security Tools
On the defensive side, quantum technologies also offer new tools to strengthen security:
- Quantum Key Distribution (QKD): QKD uses quantum properties (often of photons) to enable two parties to generate a shared random secret key with security guaranteed by physics. The most famous QKD protocol, BB84, involves sending photons in certain polarization states; if an eavesdropper tries to intercept or measure them, the quantum state collapses and the legitimate parties can detect the intrusion due to increased error rates (a toy simulation of this appears after this list). The end result is a provably secure key exchange (under the assumption that quantum physics is correct) — the security does not rely on computational difficulty, so it cannot be broken by any future computer, quantum or otherwise. QKD has advanced from labs to real-world deployments: China has built an extensive QKD fiber network (over 2,000 km) linking multiple cities, and even integrated satellite QKD links to extend the network to a total span of 4,600 km. European countries and others have their own QKD testbeds as well. While QKD is not a panacea (it requires specialized hardware and doesn’t replace encryption itself, only the key-exchange part), it is an intriguing tool for high-security environments. For cybersecurity pros, it’s important to know that QKD can complement post-quantum cryptography by providing security even against unlimited computational power, albeit with practical limitations (distance, the need for fiber or line-of-sight links, vulnerability of the quantum channel to jamming, etc.).
- Quantum Random Number Generators (QRNGs): The quality of cryptographic keys and secrets depends on true randomness. Classical random number generators can be weak or biased (and have in the past been backdoored). QRNGs use quantum phenomena – fundamentally random processes like radioactive decay or photon detection at a beam-splitter – to produce genuine random bits. Since quantum outcomes (like the decay of an atom or a quantum measurement result) are believed to be inherently unpredictable, a QRNG can provide provably unbiased random numbers. These can strengthen cryptographic keys and reduce the risk of predictable keys that undermine security. Already, some security hardware incorporates quantum entropy sources (for example, commercial chips that harvest entropy from quantum tunneling or photon-detection noise). Wider adoption of QRNGs will improve the baseline security of cryptographic systems.
- Quantum Computing for Defense: Just as attackers might use quantum computers for malicious purposes, defenders can use quantum computers or simulators to strengthen systems. For example, quantum computers could help model complex systems for security analysis, optimize network configurations for security, or run quantum simulations to test the strength of cryptographic algorithms (including PQC algorithms) against quantum techniques. Quantum algorithms might also assist in intrusion detection by quickly analyzing patterns or anomalies in large datasets of network traffic, thanks to their ability to handle certain high-dimensional computations efficiently. This is a more speculative area, but security professionals should keep an eye on how quantum computing capabilities might be integrated into security operations (e.g., rapid cryptanalysis of malware communication protocols, advanced threat modeling, etc.).
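The eavesdropper-detection logic of BB84 is easy to see in a toy intercept-and-resend simulation. This is idealized (no channel noise, perfect detectors); an eavesdropper measuring in a random basis induces an expected 25% error rate in the sifted key.

```python
import random

def bb84(n_bits, eavesdrop=False):
    """Toy BB84 run: returns the error rate Alice and Bob observe in the sifted key."""
    alice_bits  = [random.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [random.choice("+x") for _ in range(n_bits)]
    bob_bases   = [random.choice("+x") for _ in range(n_bits)]

    bob_bits = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if eavesdrop:                        # Eve measures in a random basis and resends
            e_basis = random.choice("+x")
            if e_basis != a_basis:
                bit = random.randint(0, 1)   # wrong basis: Eve's result is random
            a_basis = e_basis                # photon is re-prepared in Eve's basis
        if b_basis != a_basis:
            bit = random.randint(0, 1)       # Bob's wrong-basis measurements are random
        bob_bits.append(bit)

    # Sifting: keep only positions where Alice's and Bob's bases matched.
    sifted = [(a, b) for a, b, ab, bb in
              zip(alice_bits, bob_bits, alice_bases, bob_bases) if ab == bb]
    errors = sum(a != b for a, b in sifted)
    return errors / len(sifted)

print(bb84(20000))                   # ~0.00 -- quiet channel
print(bb84(20000, eavesdrop=True))   # ~0.25 -- intercept-and-resend betrays Eve
```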
Supply Chain Security Risks in the Quantum Era
The development and eventual deployment of quantum computing technology introduces supply chain considerations on multiple levels:
- Cryptographic Supply Chain: Modern software and hardware systems rely on a chain of cryptographic trust – from the developers’ code signing, to update distribution, to secure boot of devices. As mentioned, if an adversary has a CRQC and our cryptography isn’t upgraded in time, they could insert themselves into this supply chain. For instance, they could intercept software updates by breaking the digital signature or encryption and inject malware (like a more powerful version of the SolarWinds hack, but not needing insider access – just breaking keys). Code-signing certificates that ensure firmware authenticity could be forged. Thus, a quantum adversary can facilitate widespread supply chain attacks by exploiting now-weak cryptography. Organizations must ensure that the cryptography securing their supply chain (code signing, update delivery, PKI for device authentication) is made quantum-safe well before CRQCs arrive.
- Hardware Supply Chain for Quantum Tech: As nations and companies race to build quantum computers, the hardware components (specialized lasers, cryogenic systems, microwave electronics, etc.) become critical assets. There is a risk of sabotage or tampering in the quantum hardware supply chain. For example, if a country is procuring quantum systems or components from abroad, could there be backdoors or hidden vulnerabilities inserted that would allow an adversary to disrupt the operation of the quantum computer or steal information from it? This might sound far-fetched, but it’s a valid concern in a world where tech supply chain integrity is already a big issue (as seen with classical ICT equipment). Ensuring trusted suppliers and possibly developing domestic quantum hardware capabilities is on the radar of governments to avoid dependency on potentially compromised components.
- Geopolitical Imbalance: Quantum computing progress is an area of intense international competition. If one country achieves a CRQC first and keeps it secret, it could silently exploit it to decrypt others’ communications or gain strategic advantage (like codebreaking in WWII, but on a larger scale). This creates a national security imperative: no one wants to be the last to upgrade their cryptography, or worse, completely oblivious while their secrets are being read. This has led to initiatives to secure government and defense communications ahead of time. There’s also the scenario of “quantum monopoly” – if only a few entities control CRQCs (due to the immense cost and expertise required), others might have to rely on them as a service. That raises supply chain trust questions: would you trust a cloud quantum computing service with your encrypted data, if that service is run by a foreign entity or even a third-party company? The potential for misuse or unauthorized decryption is a new risk dimension.
- Disruption of Supply Chain Operations: A subtle point raised by some analysts is that quantum computers could also optimize and disrupt logistics and supply chain management itself (not just security). For example, a competitor with a quantum computer might solve complex optimization problems (like vehicle routing or scheduling) much faster, potentially outcompeting others (this touches on economic security). Alternatively, as noted in one analysis, the sheer speed of quantum data processing could expose timing vulnerabilities in systems that assume certain tasks (like cracking a code or authenticating) take a long time. If those assumptions break, operational processes might be thrown off. Organizations should consider quantum readiness not only for security, but also for maintaining business continuity if quantum computing changes the competitive or threat landscape.
In essence, the rise of CRQCs requires a holistic security strategy: upgrading cryptography, yes, but also re-evaluating the trust model of every link in the technology supply chain. Stronger physical security, validation of hardware, continuous monitoring for unexpected decryption or impersonation attacks, and international cooperation on quantum security standards will all play a role. As one industry analysis bluntly put it: quantum computing can make supply chain attacks far easier by breaking the encryption that traditionally safeguarded those channels. Therefore, ensuring quantum-safe practices in procurement and deployment is an emerging responsibility for cybersecurity teams.
Industry and Research Efforts
The pursuit of a cryptographically relevant quantum computer involves a broad ecosystem of tech companies, academic labs, and government programs. Here we outline the major players, technologies, and initiatives driving progress:
Major Industry Players
- IBM: IBM has been a front-runner in quantum computing, developing superconducting qubit processors and making them available via the IBM Quantum Experience cloud. IBM’s roadmap is ambitious: they unveiled a 127-qubit chip (“Eagle”) in 2021, a 433-qubit chip (“Osprey”) in 2022, and have announced a planned 1121-qubit chip (“Condor”) by 2024/25, among future devices. These processors are still NISQ, but IBM is also investing heavily in quantum error correction research. They introduced the concept of quantum volume as a metric to measure progress in quantum capability (taking into account qubit count, fidelity, connectivity). IBM aims to build scalable systems and has a robust software stack (Qiskit) and a broad partnership network. If IBM’s roadmap holds, they could cross the thousand-qubit milestone (physical qubits) around now, and then focus on integrating error correction to move toward a small fault-tolerant quantum computer later this decade. IBM is also involved in fundamental research into new qubit designs and materials to improve coherence times and gate fidelity.
- Google (Alphabet): Google’s Quantum AI division made headlines in 2019 by achieving “quantum supremacy” with a 53-qubit superconducting processor named Sycamore, performing a contrived random circuit sampling task faster than a classical supercomputer could. While that task had no direct practical use, it was a major scientific milestone. Google’s focus is similarly on superconducting qubits and error-correcting them via surface codes. They have demonstrated basic error correction building blocks and are exploring modular approaches (e.g., connecting multiple chips). Google’s researchers have also published resource estimates for breaking cryptography (like the referenced Gidney & Ekerå work on RSA-2048) and thus are keenly aware of what it will take to build a CRQC. Google’s roadmap foresees building a useful error-corrected quantum computer by roughly 2029. They continue to refine qubit fidelity; for example, improving two-qubit gate errors and crosstalk is a key focus to make the surface code practical. Google’s open-source software framework, Cirq, helps researchers prototype algorithms on current devices.
- Microsoft: Microsoft’s approach has been distinct in that they have pursued topological quantum computing. The idea is to create qubits that are inherently protected from noise by encoding information in special quasiparticles (Majorana zero modes) that exhibit topological stability. This approach, if it works, could dramatically reduce the overhead for error correction. However, it has proven extremely challenging – for years, Microsoft struggled to even produce the underlying physical effect reliably. In 2022, Microsoft researchers published evidence that they had observed the necessary topological phase, renewing hope in this strategy. Alongside this, Microsoft offers Azure Quantum, a cloud platform that provides access to various quantum hardware (from partners like IonQ, Quantinuum, etc.) and also supports quantum-inspired optimization on classical hardware. Microsoft also develops the Q# programming language. While Microsoft doesn’t yet have a large-scale quantum computer of its own in operation, its research investments (led by notable figures like Matthias Troyer and Michael Freedman) and engineering toward a topological qubit could be a dark horse that, if successful, leapfrogs others in scalability.
- Quantum Startups (IonQ, Quantinuum, Rigetti, Xanadu, etc.): A number of specialized companies are building quantum hardware:
- IonQ (a University of Maryland spin-off) and Quantinuum (formed by a merger of Honeywell Quantum Solutions and Cambridge Quantum) are focused on trapped ion quantum computers. Trapped ions use electromagnetic traps to hold charged atoms (like Ytterbium), using laser pulses to perform gates. They have the advantage of very high-fidelity gates and long qubit coherence times, and all-to-all connectivity between qubits; the challenge is slower gate speeds and harder parallelization. IonQ has reported devices with up to 32 ions and has a roadmap aiming for triple-digit ion counts with error correction in coming years. Quantinuum (Honeywell) similarly has devices in the 10s of qubits with world-record fidelities and is exploring scaling via multiple trap zones and shuttling ions between zones.
- Rigetti Computing and several academic labs (UC Berkeley, etc.) work on superconducting qubits similar to IBM/Google. Rigetti offers cloud access to its quantum processors (in the 80-qubit range as of mid-2020s) and focuses on multi-chip integration to scale beyond a single die’s qubit count.
- Xanadu (Toronto-based) and PsiQuantum are champions of photonic quantum computing. Photonic approaches use particles of light (photons) as qubits, which have the advantage of operating at room temperature and being able to leverage existing fiber-optic and silicon-photonics technology. Xanadu’s 216-mode photonic processor demonstrated a form of quantum computational advantage using Gaussian boson sampling in 2022. PsiQuantum, on the other hand, is very ambitiously aiming for a fault-tolerant photonic quantum computer with on the order of a million qubits, and has partnered with GlobalFoundries to use semiconductor fabs for its optical circuits. Photonic quantum computing is attractive for its potential to scale via integrated photonic circuits, but performing high-fidelity two-photon gates deterministically is an open problem (many schemes rely on probabilistic or measurement-induced gates).
- Neutral atom arrays (offered by companies like Pasqal and QuEra): These use neutral atoms (not ions) trapped in optical tweezers, which can be moved and arranged in 2D arrays. Quantum gates are performed by exciting atoms to Rydberg states, whose strong interactions implement entangling operations. These systems have produced demonstrations with 100+ atoms and are promising for scalability (since atoms are identical and hundreds can be trapped by scaling up the laser systems).
Each of these technological approaches (superconducting, trapped ion, photonic, neutral atoms, topological, etc.) has pros and cons, and it’s not yet clear which will achieve the fault-tolerance threshold at scale first. It’s possible that a combination or a hybrid approach could be used in the future.
Academic Consortia and Research
Universities and research institutes worldwide are heavily involved, often collaborating with industry. Notably:
- Universities like MIT, Caltech, Oxford, Delft, University of Maryland, and many others have dedicated quantum research groups tackling everything from materials science for better qubits to quantum algorithms.
- There are cross-institution projects, for example the NSF Quantum Leap challenges in the U.S., or the EU Quantum Flagship (discussed below).
- Open source software and community contributions also play a role (e.g., software frameworks like Qiskit, Cirq, Forest, PennyLane for hybrid quantum-classical machine learning, etc., are developed with academic input).
National Initiatives
- United States (National Quantum Initiative): The U.S. has a coordinated program via the National Quantum Initiative Act of 2018, which authorized a substantial federal investment (over $1.2 billion over 5 years across agencies) in quantum information science. This includes funding for Department of Energy Quantum Research Centers, National Science Foundation quantum centers, and NIST research. The U.S. also passed the CHIPS and Science Act in 2022 which further supports quantum R&D. In parallel, agencies like DARPA and IARPA fund quantum computing research with security implications (e.g., IARPA’s programs on quantum-resistant security and quantum algorithms). The aim is to ensure the U.S. remains at the forefront of quantum tech for both economic and national security reasons. There’s also an emphasis on workforce development, standardization (NIST’s PQC effort is part of this), and industry collaboration through consortia like the Quantum Economic Development Consortium (QED-C).
- China: China has made quantum technology a top strategic priority, investing massive sums (estimated around $10–15 billion in government funding) into quantum research. They have constructed the National Laboratory for Quantum Information Sciences in Hefei with a multi-billion dollar budget. China leads the world in some quantum metrics, especially in quantum communication: as noted, they built an integrated satellite-and-fiber QKD network spanning 4,600 km and performed landmark experiments like quantum teleportation from space. In quantum computing, Chinese groups have also claimed “quantum advantage” demonstrations – for instance, a team used a photonic circuit to perform boson sampling faster than classical computation (a different approach from Google’s superconducting result). In 2021, a group at USTC (University of Science and Technology of China) demonstrated a 66-qubit superconducting quantum computer called Zuchongzhi, which performed a random circuit sampling task reported to be orders of magnitude harder than the one in Google’s supremacy experiment. Chinese researchers are active in superconducting qubits, trapped ions, and photonics. One key difference is that much of China’s effort is state-driven, with less visibility into any classified programs that might exist. The geopolitical implication is a quantum “race” – China’s goal is to not rely on Western tech and to beat the West in quantum capability, given the military and economic edge that could provide. This has spurred other countries to accelerate their efforts.
- European Union (Quantum Flagship and others): The EU launched the Quantum Technologies Flagship, a 10-year, €1 billion research initiative beginning in 2018. This funds dozens of projects across themes like quantum computing, simulation, communication, and sensing. European researchers have been leaders in quantum theory and experiments (e.g., many pioneering quantum computing proposals came from Europe, and EU-based teams continue advancing ion trap tech, photonic chips, etc.). The Flagship has industry partners and aims to translate research into commercial applications, ensuring Europe remains competitive. Additionally, individual European countries have significant programs: Germany, France, the U.K., the Netherlands, Switzerland and others have all committed national funding in the hundreds of millions of euros each for quantum R&D. For example, Germany’s program is over €2 billion, France’s “Quantum Plan” is €1.8B, and the U.K. has invested heavily since its 2014 national quantum programme (the U.K. is focusing on four hubs: computing, sensing, imaging, and communications). This collaborative approach in Europe is yielding startup companies and research breakthroughs (for instance, France’s Pasqal in neutral atoms, Finland’s IQM in superconducting qubits, etc.).
- Others: Canada has a strong quantum research community (Perimeter Institute, IQC at University of Waterloo, D-Wave Systems in Vancouver for quantum annealing). Australia has been notable particularly in silicon spin qubits (UNSW and Silicon Quantum Computing Pty). Japan has long-running efforts (in NTT, universities, and a national Quantum Leap flagship program). Russia had quantum research as well, though less is publicly known recently. India announced a National Quantum Mission in 2023 with a significant budget. In summary, many nations view quantum tech as the next frontier akin to the space race, and are investing accordingly to avoid falling behind.
All these efforts contribute to a rapid pace of progress. In just the last few years, we went from ~50-qubit devices to devices with hundreds of qubits, and demonstrations of quantum advantage in specialized tasks. But we are not yet at a CRQC. The race now is to improve qubit quality (to reduce overhead for error correction) and quantity (to have enough qubits to do something classically intractable yet useful). The industry is optimistic that practical quantum computers will arrive in this decade, though exactly when a cryptographically relevant one will appear is uncertain. Notably, even before a full CRQC exists, smaller quantum computers could be leveraged by nation-states to attack smaller cryptographic keys or weakened algorithms – for instance, breaking 1024-bit RSA (which some legacy systems still use) may become feasible earlier. This is another reason industry and government are pushing to phase out even medium-size keys and move to PQC sooner rather than later.
Conclusion
Cryptographically Relevant Quantum Computers (CRQCs) represent a seismic shift on the horizon of cybersecurity. In this article, we’ve seen that CRQCs are defined by their ability to execute quantum algorithms (like Shor’s and Grover’s) at a scale that breaks the cryptographic primitives we rely on daily. While still likely years (if not a decade or more) away, their eventual arrival is not a question of “if” but “when,” according to most experts. For cybersecurity professionals, the implications of CRQCs are profound:
- Summary of Importance: CRQCs will be game-changers – they will render insecure the public-key cryptosystems that undergird secure communications, financial transactions, digital identity, and more. A sufficiently advanced quantum computer can crack RSA, Diffie–Hellman, and elliptic curve cryptography, undermining the confidentiality and integrity of sensitive data worldwide. Even protocols and systems that use symmetric cryptography will see their security margin reduced by quantum attacks. Essentially, a large chunk of the cybersecurity toolkit must be rethought in the face of quantum capabilities. This is not just an academic concern but a practical one with high stakes for governments (e.g., diplomatic and military communications security), enterprises (intellectual property, customer data protection), and individuals (privacy of communications, integrity of digital signatures).
- Urgency of Post-Quantum Preparation: Given the potential for “Y2Q” (the day a quantum computer breaks our crypto) to arrive within the next 10-20 years, there is an urgency to transition to post-quantum cryptography. The process of standardizing PQC is already underway (NIST’s selections in 2022 are a major milestone), but deploying them globally is a massive effort. It’s important to start now: inventory your organization’s cryptographic usage, prioritize systems that need updates, and implement crypto agility so that algorithms can be switched out easily in the future. Remember that data that is sensitive for the long term (health records, state secrets, etc.) is already vulnerable to being harvested now and decrypted later if adversaries suspect it’s worth it. The transition period is a race against time – one where defenders must act before attackers have the quantum tool in hand. Governments are treating this with the seriousness of a looming cybersecurity crisis, issuing directives and funding migration efforts. Private sector security leaders should do the same: incorporate PQC into upgrade cycles, test new algorithms in pilot programs, and stay updated on standards. The cost of proactive migration is far less than the cost of a breach or compromise due to broken encryption.
- Broader Cyber Strategy: Preparing for CRQCs isn’t just about changing algorithms. It’s about a holistic strengthening of security posture:
- Emphasize defense in depth (so that even if cryptography is one layer of defense, other measures like access control, monitoring, and data segmentation limit the damage if that layer is pierced).
- Keep an eye on quantum advancements not just in computing but sensing – understand that new attack vectors might emerge (like quantum-enhanced side-channel attacks) and be ready to mitigate them.
- Leverage quantum tech for good where possible: consider quantum random number generators for better keys, and in the future, potentially quantum key distribution for highest-security links.
- Promote crypto-agility in design: since we may need to swap out algorithms again (who knows, maybe in a few decades if someone invents an even more powerful paradigm or if some PQC algorithms are found weak), we should architect systems to be flexible.
- Collaborate and share information: the threat is global, so threat intel about quantum-related attacks or progress should be shared across the community. Organizations like ETSI, IETF, and others have working groups for PQC and quantum-safe practices – engagement with these can help keep professionals informed and influence solutions that work in practice.
- Open Research Questions and Future Developments: There are still many unknowns in the quantum security equation:
- On the technical side, will there be further breakthroughs that reduce the resource requirements for a CRQC (for example, new quantum algorithms, better error-correction codes, or physical qubit improvements)? Research like the 2024 factoring paper that cut qubit counts for RSA shows that continuous improvement is happening. We have to plan under uncertainty, ideally with worst-case assumptions.
- Will any new quantum algorithms emerge that threaten even the post-quantum schemes? So far, none are known – lattice and code problems have withstood decades of scrutiny – but cryptographers remain vigilant.
- How will implementation challenges be addressed? PQC algorithms need side-channel-resistant implementations; otherwise, they might be broken not by math, but by leakage. It’s an open area of research to develop efficient, secure implementations of PQC in hardware and software.
- Quantum computing might also unlock new positive security techniques (like more powerful forms of encryption such as homomorphic encryption improvements via quantum, or new quantum protocols for authentication). The interplay of quantum tech and classical cybersecurity will continue to evolve.
- On the policy side, how will governments respond if a nation achieves a CRQC? There may be calls for export controls on quantum tech, international agreements akin to arms control (since a CRQC can be seen as a strategic weapon in cyberspace). Ensuring a quantum balance of power might become a concern to prevent destabilizing scenarios.
In conclusion, cybersecurity professionals should view CRQCs not with panic but with proactive resolve. The situation is often compared to a ticking clock – we know a major cryptographic upheaval is coming, and we have a window of time to get ready. By staying educated (as you are doing by reading detailed analyses like this), influencing your organization’s roadmap for crypto upgrades, and contributing to the community effort on quantum-ready security, you can help ensure that when quantum computers arrive, our networks and data remain safe. The cryptographic community has faced big transitions before (like the move from DES to AES, or from SHA-1 to SHA-256, albeit those were smaller in scope), and each time it required coordination and diligence. The quantum transition is larger by far, but not insurmountable.
The key takeaway is one of adaptation: the threat from CRQCs is real and important, but with well-thought-out preparation – embracing post-quantum cryptography, improving overall security hygiene, and leveraging quantum technologies for defense – we can continue to secure cyberspace in the quantum age. The time to start that adaptation is now, before the threat fully materializes. By doing so, we convert the story of CRQCs from one of looming catastrophe to one of resilience and innovation in the face of change.