
Breaking RSA Encryption: Quantum Hype Meets Reality (2022–2025)


In late 2022, a remarkable claim jolted the cryptography and quantum computing communities. A team of 24 Chinese researchers quietly posted a preprint on arXiv claiming they had factored a 48-bit RSA-style integer using a 10-qubit quantum computer. This was no mere 15 or 21 (the trivial benchmarks often cited in early quantum experiments) – it was a 48-bit semiprime (a product of two primes), orders of magnitude larger than any number previously factored with quantum help. The paper tantalizingly suggested that their hybrid quantum-classical algorithm could scale up to break RSA-2048 encryption with just 372 qubits, far fewer than the millions of qubits Shor’s algorithm would naively require. If true, it meant that a sufficiently advanced quantum computer already on the horizon might crack the cryptographic keys underpinning much of the internet’s security. IBM’s 433-qubit Osprey processor, unveiled in 2022, already exceeded the qubit count the researchers estimated for cracking RSA-2048. Suddenly, what had long been a distant hypothetical threat – quantum codebreaking – felt as though it might be imminent.

But in the tech world, extraordinary claims demand extraordinary evidence. Did this 48-bit quantum factorization breakthrough really herald the end of RSA encryption? Over the next two years, from December 2022 through early 2025, the initial euphoria gave way to a sober reality check. Researchers around the globe scrambled to understand, replicate, and critique the new approach. Competing teams tried alternative quantum-classical hybrids – from quantum annealers to other variational algorithms – chasing ever-larger factorizations. Bold claims and counterclaims flew, including viral headlines that erroneously implied “RSA-2048 has been broken”, which experts were quick to bat down. Meanwhile, the cryptography community kept a wary eye on lattice-based algorithms – the mathematical cousin leveraged in this new quantum factoring attempt – because those same lattices form the backbone of emerging post-quantum cryptography intended to replace RSA. Three years on, it’s time to take stock of what really happened in this feverish period. Has anyone actually come close to breaking RSA-2048? Or did the much-hyped 48-bit quantum factorization turn out to be more of a quantum mirage than a sign of RSA’s imminent doom? Let’s dive into the developments, from peer-reviewed verifications and failed replications to experimental demos and the true state-of-the-art as of 2025.

The 48-Bit Quantum Factorization Claim

On December 23, 2022, the paper titled “Factoring integers with sublinear resources on a superconducting quantum processor” appeared on arXiv. In it, Bao Yan and colleagues described a novel hybrid algorithm that combined classical lattice-based techniques with a quantum optimization procedure. Instead of using Shor’s famous quantum factoring algorithm from 1994 (which in theory could factor large numbers but in practice needs an unthinkably large, error-corrected quantum computer), this new approach repurposed a much-discussed lattice algorithm by mathematician Claus Schnorr. Schnorr’s algorithm is a classical method that attempts to reduce factoring to a lattice problem – essentially a high-dimensional maze where the solution corresponds to the prime factors. By itself, Schnorr’s method had never managed to factor large RSA numbers (it “falls apart at larger sizes,” as one analysis noted). But the twist in Yan et al.’s approach was to offload the hard part of Schnorr’s algorithm to a quantum routine known as QAOA (Quantum Approximate Optimization Algorithm).

In simple terms, the algorithm worked like this: First, use Schnorr’s lattice method to set up a problem of finding a certain short vector in a lattice (related to the secret factors of the RSA number). This is akin to finding a needle in a mathematical haystack – classically very tough. Then, instead of purely classical brute force, employ a quantum optimizer (QAOA) to search for the needle more efficiently. QAOA is a variational algorithm that runs on today’s noisy quantum machines; it tries to gradually “guess” better and better solutions to an optimization problem by tweaking quantum parameters. By plugging QAOA into the lattice setup, the authors hoped the quantum computer could zero in on the correct combination of values (what they called a “smooth relation pair”) that would reveal the factors of the large number.
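
To make that quantum step concrete, below is a minimal QAOA simulation in plain NumPy. It is only a sketch: the cost function is a random table with one planted minimum, standing in for the paper’s actual lattice-derived cost (which scores closeness to the target short vector), and the classical outer loop is a crude random search over the angles rather than a real optimizer.

```python
import numpy as np

# Toy stand-in for the hybrid's quantum step: QAOA over n binary
# variables with a diagonal cost function. In the real algorithm the
# cost encodes distance to a target lattice vector; here it is an
# arbitrary table with one planted minimum, purely to show mechanics.
n = 4                                    # qubits (the paper's device had 10)
dim = 2 ** n
rng = np.random.default_rng(1)
cost = rng.uniform(0.0, 1.0, size=dim)   # C(z) for every bitstring z
cost[rng.integers(dim)] = -1.0           # plant one "short vector" solution

def x_on(qubit):
    """Pauli-X on one qubit, expanded to the full 2^n-dimensional space."""
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    I = np.eye(2, dtype=complex)
    out = np.array([[1.0 + 0j]])
    for q in range(n):
        out = np.kron(out, X if q == qubit else I)
    return out

Xs = [x_on(q) for q in range(n)]

def qaoa_state(gammas, betas):
    """Start in |+...+>, then alternate cost and mixer layers."""
    psi = np.full(dim, 1 / np.sqrt(dim), dtype=complex)
    for g, b in zip(gammas, betas):
        psi = np.exp(-1j * g * cost) * psi             # e^{-i*gamma*C} (diagonal)
        for X in Xs:                                   # e^{-i*beta*X_q} per qubit
            psi = np.cos(b) * psi - 1j * np.sin(b) * (X @ psi)
    return psi

def expected_cost(params):
    p = len(params) // 2
    psi = qaoa_state(params[:p], params[p:])
    return float(np.real(np.sum(cost * np.abs(psi) ** 2)))

# Crude classical outer loop: random search over (gamma, beta) angles.
best = min(expected_cost(rng.uniform(0, np.pi, 4)) for _ in range(500))
print("best <C> found:", best, "| true minimum:", cost.min())
```

The division of labor mirrors the paper’s: the quantum device only prepares and samples the variational state, while a classical computer keeps adjusting the angles – which is exactly why the method fits on noisy, shallow hardware.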

Crucially, the resource count for this method was drastically lower than Shor’s algorithm. The authors claimed the number of qubits required grows only sublinearly with the bit-length of the integer – on the order of $$n/\log n$$ qubits for an $$n$$-bit number. For RSA-2048 (which has a key length of 2048 bits), this formula suggested only around 372 qubits would be needed. Compare that to standard estimates for running Shor’s algorithm on RSA-2048, which call for millions of physical qubits when error-correction overhead is included. In other words, Yan et al. were promising a quantum shortcut that sidestepped the need for a full-blown fault-tolerant quantum computer. Even more attention-grabbing, they didn’t just theorize the algorithm – they ran it on a 10-qubit superconducting quantum processor and reportedly factored integers up to 48 bits in length. This made 48 bits the largest integer ever factored (albeit with heavy classical assistance) on quantum hardware at the time, surpassing earlier rudimentary demonstrations that had topped out at small numbers like 21 or 35. The 48-bit number in question was a miniature RSA-style semiprime (roughly 15 decimal digits long), and the hybrid quantum-classical run successfully found its two prime factors.
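
As a sanity check on that headline number: the reported 372 qubits matches a count of roughly $$2n/\log_2 n$$ for an $$n$$-bit modulus. To be clear, that exact constant is reverse-engineered here from the paper’s own figures rather than quoted from it:

```python
import math

def estimated_qubits(n_bits: int) -> float:
    # Sublinear count consistent with the paper's headline claim:
    # roughly 2n / log2(n) qubits for an n-bit modulus. The factor of 2
    # is an assumption inferred from the reported 372 qubits at n = 2048.
    return 2 * n_bits / math.log2(n_bits)

for n in (48, 1024, 2048):
    print(n, "bits ->", round(estimated_qubits(n)), "qubits")
# 2048 bits -> 372 qubits, matching the claim; contrast that with the
# millions of physical qubits usually budgeted for fault-tolerant Shor.
```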

The news, once noticed, spread fast. On January 3, 2023, renowned security expert Bruce Schneier blogged about the paper, calling it “something to take seriously” – albeit with caution – and summarizing the method for a broad audience. The next day, The Financial Times ran a story titled “Chinese researchers claim to find way to break encryption using quantum computers,” further fanning the flames. After all, if RSA-2048 could indeed be cracked with only hundreds of qubits, and companies like IBM already had chips of that scale, the implication was staggering. As Schneier noted, IBM’s 433-qubit Osprey and other quantum processors in that range meant the hardware hurdle might already be met. Were we about to witness RSA-2048, the cornerstone of internet security, fall decades earlier than expected?

Hype, Hope, and Harsh Reality

The frenzy of excitement was almost immediately tempered by voices of skepticism – and for good reason. On January 4, 2023, the very same day the FT article ran, Scott Aaronson, a leading quantum computing theorist, penned a blunt rebuttal on his blog. His post, memorably titled “Cargo Cult Quantum Factoring,” opened with a three-word review directed at the Chinese paper: “No. Just No.” Aaronson’s colorful dismissal echoed what many experts were quietly thinking: Schnorr ≠ Shor. In other words, no matter how similar their names, one shouldn’t confuse Schnorr’s lattice-based trick (now paired with QAOA) with Shor’s proven quantum algorithm. The former had never factored anything of cryptographic significance, and simply adding a quantum optimization heuristic – one that “has not yet been convincingly argued to yield any speedup for any problem whatsoever” – was viewed as unlikely to change that fundamental limitation.

Indeed, buried in the conclusion of the Chinese team’s own paper was a telling caveat: “It should be pointed out that the quantum speedup of the algorithm is unclear due to the ambiguous convergence of QAOA.” In plainer language, even the authors acknowledged they hadn’t proven that their quantum component actually accelerates the lattice-based factoring process for larger numbers. It might just be that their approach worked for 48-bit examples due to clever classical preprocessing – or even a bit of luck – but would fail to offer any advantage at larger scales. Aaronson and others seized on this, arguing that without a clear quantum speedup, one was essentially left with Schnorr’s algorithm running on classical resources in the background. And if Schnorr’s method could “destroy RSA,” it would have done so already on a classical computer. The consensus among skeptics was that a quantum heuristic, no matter how fancy, cannot rescue a fundamentally unsound factoring approach.

Over the course of 2023, this skepticism was put to the test. Could anyone else replicate or extend the 48-bit factoring feat? A number of research groups took up that challenge, and the results poured cold water on some of the bolder claims. In July 2023, two Google researchers, Tanuj Khattar and Noureldin Yosri, released a pointed follow-up study titled “A comment on ‘Factoring integers with sublinear resources on a superconducting quantum processor’.” This wasn’t just an opinion piece – Khattar and Yosri built an open-source implementation of the Chinese team’s hybrid algorithm from scratch and tried scaling it up in simulations. They even assumed a “perfect” quantum optimizer (essentially giving QAOA unrealistically ideal performance) to see how far the method could go in the best-case scenario. The findings were sobering: the algorithm could consistently factor only up to 70-bit integers, and then started failing. At around 80 bits, it could no longer find the needed “relations” to factor the number. In other words, the touted sublinear qubit approach hit a wall well before reaching even 80-bit numbers, let alone 2048-bit RSA. The dramatic claims about breaking RSA-2048 with 372 qubits did “not hold true,” the authors concluded dryly. The implication was clear – the original result did not scale: what worked at 48 or 50 bits broke down only a few dozen bits higher. (For context, a 70-bit semiprime is still astronomically smaller than RSA-2048. RSA-2048 has 617 decimal digits; a 70-bit number has only about 21 decimal digits. Classical computers can factor 70-bit numbers in the blink of an eye.)
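
One concrete piece of machinery explains the failure mode. Schnorr-style factoring works only if it keeps finding “smooth” values – numbers whose prime factors all lie in a small, fixed factor base – and it was precisely these relations that dried up as the bit-length grew. A minimal smoothness test looks like this (a generic sketch, not code from the replication):

```python
def is_smooth(x: int, factor_base: list[int]) -> bool:
    """True if x factors completely over the given small primes.
    Schnorr-style factoring must collect many such smooth values;
    when they stop appearing, the whole pipeline stalls."""
    for p in factor_base:
        while x % p == 0:
            x //= p
    return x == 1

primes = [2, 3, 5, 7, 11, 13]
print(is_smooth(2 * 3**4 * 13, primes))  # True: every factor is in the base
print(is_smooth(2 * 17, primes))         # False: 17 lies outside the base
```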

Other researchers dissected the approach further. It turned out the Chinese team had leaned on a 2017 paper by Schnorr (along with his later tweaks from 2021) that claimed a faster way to factor via lattices. Those claims had already been controversial, with most experts finding that Schnorr’s method worked only on small toy examples and broke down for realistic sizes. The quantum algorithm (QAOA) was ostensibly used to overcome that breakdown. However, QAOA itself is a heuristic – one with no guaranteed performance and known to sometimes get stuck as problem sizes grow. The phrase “ambiguous convergence” in the Chinese paper was a polite way of saying we’re not really sure if the quantum part will continue to find the needle in the haystack as the haystack gets exponentially bigger. By the end of 2023, the broad sentiment was that the 48-bit demonstration, while technically intriguing, did not foreshadow an immediate quantum leap to factoring big RSA keys. Aaronson’s initial “No. Just no” assessment had effectively been vindicated.

That said, the story doesn’t end with simply debunking the original claim. In the wake of Yan et al.’s paper, a flurry of research activity focused on hybrid quantum-classical factoring methods. If one positive thing came out of the episode, it’s that it spurred new ideas and experiments, even if only to answer the question: “Can we do any better?”

Hybrid Quantum-Classical Approaches: Trials and Tribulations

The Chinese proposal opened a kind of Pandora’s box of alternative factoring schemes, all attempting to reduce the daunting requirements of Shor’s algorithm by using quantum resources in clever, limited ways. One immediate variant came from Narendra Hegade and Enrique Solano, who in January 2023 (just weeks after the original paper) proposed what they called digitized-counterdiabatic quantum factorization (DCQF). This mouthful of a method was another way to use a near-term quantum computer to help solve the lattice problem at the heart of Schnorr’s approach. Rather than QAOA’s adiabatic-esque gradual optimization, DCQF explicitly incorporates counterdiabatic driving – a technique from quantum physics intended to speed up reaching a system’s ground state. In essence, Hegade and Solano claimed their approach could find the required lattice vector (and thus the factors) with higher probability than QAOA for the same problem instances. They reported a six-fold improvement in success probability on small cases, by “retrieving the lowest energy state of the corresponding Hamiltonian” more reliably than QAOA. However, like the original, this too remained a proof-of-concept on tiny numbers – a promising tweak in principle, but unproven to scale. A later comprehensive analysis noted that while DCQF might outperform QAOA on the sub-problem, it was “not enough to materially alter” the broader prospects of Schnorr’s factoring method for large integers. In plainer terms, even a better quantum optimizer doesn’t solve the fundamental issue that Schnorr’s lattice approach itself likely requires exponential time as the semiprime size grows.

Other groups explored swapping out QAOA for different quantum variational algorithms. In late 2024, for example, a team led by Luis Sánchez-Cano investigated using the Variational Quantum Eigensolver (VQE) in place of QAOA for the same lattice-based pipeline. VQE is another popular algorithm for NISQ (noisy intermediate-scale quantum) devices, typically used to find ground states of molecules. The researchers hypothesized that VQE’s shorter quantum circuits might cope with noise better than QAOA’s deeper circuits, even if QAOA might converge faster in an ideal scenario. Their work, titled “Factoring integers via Schnorr’s algorithm assisted with VQE,” indeed found that shallower VQE circuits were less prone to errors on current hardware and could sometimes succeed where QAOA failed on small examples. However, they too encountered the unforgiving reality of scalability: increasing the number of qubits (and thus the lattice dimension) did not guarantee success and often led to the algorithm getting stuck in local minima. They managed to factor some modest 20- to 30-bit numbers with these hybrid methods, but anything much larger remained out of reach without enormous runtime and many repetitions.

Notably, some of the progress in this period didn’t even require a quantum computer at all. In mid-2023, a Japanese research team (Yamaguchi et al.) reported factoring RSA-type composites up to 55 bits using purely classical means – specifically, a classical annealing algorithm inspired by the quantum approach. In other words, they took the same lattice setup from Yan et al.’s method but used classical simulated annealing (a Monte Carlo technique that mimics the cooling of metal to find low-energy states) to do what QAOA was supposed to do. And it worked, at least for 50- to 55-bit numbers. This was a bit of a reality check: it suggested that the success at 48 bits might not have come from any quantum magic at all, but from the clever lattice reductions and perhaps the randomness inherent in the search – something a well-tuned classical algorithm could replicate. The Japanese researchers, along with collaborators like Arata Sato, continued to push lattice-based factoring on classical computers. By September 2024, they published an experimental analysis of lattice methods factoring numbers up to 90 bits (still using classical computing) and measuring the success probabilities and runtime in detail. Their findings reinforced that while lattice approaches are an intriguing alternative path, they scale exponentially with the bit-length in practice, just like all known classical factoring algorithms. In short, lattices didn’t offer a free lunch – at least not with current insights.
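
To see what “classical annealing doing QAOA’s job” means in practice, here is a minimal Metropolis-style simulated annealer over bitstrings. The cost function is a toy stand-in; Yamaguchi et al.’s actual energy function came from the lattice setup:

```python
import math
import random

def simulated_annealing(cost, n_bits, steps=20000, t_hot=2.0, t_cold=0.01):
    """Metropolis search over bitstrings with a geometric cooling schedule.
    This is the generic classical routine that stood in for QAOA; the real
    work used an energy function derived from the lattice setup."""
    state = [random.randint(0, 1) for _ in range(n_bits)]
    energy = cost(state)
    for k in range(steps):
        t = t_hot * (t_cold / t_hot) ** (k / steps)   # cool down gradually
        i = random.randrange(n_bits)
        state[i] ^= 1                                 # propose a single bit flip
        new_energy = cost(state)
        if new_energy <= energy or random.random() < math.exp((energy - new_energy) / t):
            energy = new_energy                       # accept the move
        else:
            state[i] ^= 1                             # reject: undo the flip
    return state, energy

# Toy cost: Hamming distance to a hidden target (global minimum 0).
target = [1, 0, 1, 1, 0, 0, 1, 0]
print(simulated_annealing(lambda s: sum(a != b for a, b in zip(s, target)), len(target)))
```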

Meanwhile, on the experimental front, a few “real” quantum factoring demonstrations took place, though at very small scales, more to prove out technology than to threaten RSA. In March 2025, a team at the Russian Quantum Center and Lebedev Institute demonstrated the hybrid Schnorr-QAOA algorithm on a trapped-ion quantum computer. Using just 6 qubits, they managed to factor the number 1591 into 37 × 43. That might sound unimpressive (after all, 1591 is only 12 bits), but the significance was in using an improved QAOA variant called “fixed-point QAOA” to enhance the success rate of finding the factors. They also showed via simulations that, in principle, their approach could factor a 27-bit number with 10 qubits and a 45-bit number with 15 qubits, given enough runtime. These results didn’t break any size records, but they demonstrated the feasibility of running such hybrid algorithms on actual quantum hardware (trapped-ion qubits, in this case) with some success. It was a step forward in technical capability – if not a leap in breaking larger keys.

And what about those quantum annealers? Companies like D-Wave have long built special-purpose quantum machines that solve optimization problems (via quantum annealing), and factoring can be cast as an optimization problem too. In fact, analog quantum annealers achieved some factoring milestones earlier on. By 2023, the record for a number factored purely by an analog quantum computer stood at a 23-bit integer, done by a D-Wave machine. One method, for example, formulated factoring as a problem of minimizing an energy function (QUBO – quadratic unconstrained binary optimization) and used D-Wave’s 5000+ qubits to find the factors. However, embedding a larger factoring problem into an annealer is extremely challenging – the required number of qubits and connectivity blow up quickly. Researchers at Forschungszentrum Jülich in Germany tested various annealing-based factoring methods and found that while they could outperform random guessing (i.e. they showed some advantageous scaling for small sizes), the success probabilities still dropped exponentially as numbers grew. The largest semiprime they reliably factored with the D-Wave in those tests was in the low 20-bit range. So, despite some optimistic early indications (and even a numerical suggestion that quantum annealing might scale polynomially), the empirical data so far keeps annealers in the exponential camp too.
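
The formulation itself is easy to sketch: factoring $$N$$ becomes minimizing the energy $$E(p,q) = (N - pq)^2$$ over the bits of two candidate factors, with the global minimum (energy zero) sitting exactly at the true factorization. The exhaustive sweep below stands in for the annealer’s physical search and only works at toy sizes – which is exactly the embedding problem described above:

```python
from itertools import product

def factor_by_energy_minimization(N: int, bits: int):
    """Encode factoring as minimizing E(p, q) = (N - p*q)^2 over the bits
    of two odd candidate factors. The exhaustive sweep here plays the role
    of the annealer, which explores the same energy landscape physically."""
    best = (None, None, float("inf"))
    for p_bits in product((0, 1), repeat=bits):
        for q_bits in product((0, 1), repeat=bits):
            # Low bit pinned to 1 so both candidates are odd.
            p = 1 + sum(b << (i + 1) for i, b in enumerate(p_bits))
            q = 1 + sum(b << (i + 1) for i, b in enumerate(q_bits))
            energy = (N - p * q) ** 2
            if energy < best[2]:
                best = (p, q, energy)
    return best

print(factor_by_energy_minimization(1591, bits=5))  # -> (37, 43, 0)
```

Note how quickly this blows up: two 1024-bit factors need over 2,000 binary variables plus couplings between them, which is where the embedding and connectivity limits bite.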

Through 2024 and into 2025, each new attempt – be it digital quantum, analog quantum, or hybrid – seemed to reinforce a sobering truth: RSA-2048 wasn’t in imminent danger from quantum attacks. While the research was tremendously valuable for advancing quantum algorithms and understanding the lattice approach, every path to actually factoring big cryptographic numbers hit a dead end at relatively modest sizes. As one review put it, “the methods have not demonstrated prime factors at a challenging scale”. Still, the flurry of work did yield some silver linings. It sharpened the community’s knowledge of lattice-based techniques, which is useful not only for attacking RSA but also for defending against quantum attacks (since lattice problems are central to post-quantum encryption). It also spurred improvements in quantum optimization algorithms that could have applications beyond factoring, such as in chemistry or logistics. In a sense, the race to break RSA became a convenient yardstick to drive quantum algorithm research forward – even if RSA itself remained unbroken.

Lattice Techniques: A Double-Edged Sword

It’s worth reflecting on the curious role of lattices in this saga. The 2022 breakthrough claim and many of the follow-up attempts all hinged on lattice reduction – a cornerstone of certain cryptographic problems. Lattices are geometric structures of points in n-dimensional space, and finding short vectors in a lattice (the Shortest Vector Problem or related Closest Vector Problem) is generally hard. Interestingly, lattice-based problems are exactly what next-generation cryptography is built on. When news broke that lattice methods were being used to attack RSA, it raised eyebrows: after all, NIST’s post-quantum cryptography competition (whose first winning algorithms were announced in 2022) selected lattice-based schemes to secure our future communications in a quantum world. Did the renewed interest in Schnorr’s lattice factoring algorithm mean lattices weren’t as secure as hoped?

The short answer is no – at least not in a way that affects post-quantum encryption. Schnorr’s approach uses lattices in a very specialized manner, tied to the structure of factoring-specific equations (finding a “smooth” difference of squares that reveals a factor). The fact that researchers could leverage lattice algorithms to factor 50- or 60-bit numbers doesn’t translate to a break of general lattice-based cryptography. In fact, what we saw was more the reverse: insights from cryptography were being used to attack a classical problem (factoring). Historically, factoring and lattice problems have been quite distinct in complexity. The General Number Field Sieve (GNFS) – the best classical algorithm for factoring large numbers – does use lattices in places (its “lattice sieving” stage reduces small, low-dimensional lattices with standard tools such as the LLL algorithm of Lenstra, Lenstra, and Lovász), but only as a workhorse subroutine, not as the engine of the attack. Schnorr’s idea was essentially to do a more cunning lattice reduction tailored to factoring. It showed some promise for small cases but never beat GNFS for large ones. What the quantum component added was hope that maybe the “hard part” of Schnorr’s method could be offloaded to a quantum solver.
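
The “smooth difference of squares” step mentioned above is worth seeing explicitly, since every sieve-style method – Schnorr’s included – is ultimately a machine for manufacturing such pairs. Given $$x^2 \equiv y^2 \pmod{N}$$ with $$x \not\equiv \pm y \pmod{N}$$, a nontrivial factor falls out of a single gcd:

```python
from math import gcd

def factor_from_squares(N: int, x: int, y: int):
    """Given x^2 ≡ y^2 (mod N) with x ≢ ±y (mod N), gcd(x - y, N)
    yields a nontrivial factor of N."""
    assert (x * x - y * y) % N == 0
    d = gcd(x - y, N)
    return d if 1 < d < N else None

# Worked example: 10^2 = 100 ≡ 9 = 3^2 (mod 91), so gcd(10 - 3, 91) = 7.
print(factor_from_squares(91, 10, 3))  # -> 7, and indeed 91 = 7 * 13
```

The hard part was never this final gcd; it is producing the congruent pair in the first place, which is what the smooth relations (and the lattice search for them) are for.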

For lattice-based post-quantum cryptography (PQC) algorithms (like CRYSTALS-Kyber or Dilithium, now being standardized), the security relies on the assumption that no efficient algorithm – classical or quantum – can solve certain lattice problems (like finding very short vectors) in reasonable time. The fact that even with quantum help we struggled to factor numbers via lattices actually reinforces that assumption. After all, factoring an $$n$$-bit RSA number via Yan et al.’s method effectively translates to solving lattice problems whose dimension grows with the bit-length (for example, around 50-dimensional lattices for an 80-bit number, and higher for bigger moduli). The difficulty the community had in going much beyond 60 or 70 bits indicates that high-dimensional lattices remain a formidable challenge, as expected. So, in a roundabout way, the partial failure of these quantum hybrid factoring attempts is good news for lattice-based crypto. It suggests that the lattices underpinning PQC are not easily compromised by the same techniques – if they were, we would likely have seen a more dramatic success in factoring by now.

That said, the interplay between lattice crypto and factoring research is fascinating. The Chinese team essentially borrowed a post-quantum mindset to attack a pre-quantum cryptosystem. It’s a reminder that ideas in cryptography can often be repurposed in unexpected ways. One researcher’s hard problem can be another researcher’s tool. Moving forward, we may see more such cross-pollination. For instance, improvements in algorithms for the Shortest Vector Problem (SVP) could simultaneously tighten the security estimates for PQC and provide new avenues to attempt factoring, or even attack other systems like discrete log, in hybrid quantum-classical ways. The lattice approach did not unseat RSA in these past few years, but it added a rich new chapter to the story of integer factorization – one that will likely continue to be studied and refined.

RSA-2048: Still Safe for Now

As of April 2025, RSA-2048 remains unbroken – and by all credible accounts, still far beyond the reach of both quantum and classical techniques. The frenzied period following the 48-bit factoring announcement ultimately did not produce any technique that dramatically lowers the resources needed to crack modern RSA keys. The state-of-the-art in demonstrated quantum/hybrid factoring is still factoring numbers with at most a few dozen bits. We went from 15 (factored by Shor’s algorithm on a 7-qubit NMR device back in 2001) to 21 (on a photonic device in 2012) to 35 (on an ion trap in 2019) – and then to 48 bits (with heavy classical assistance in 2022). That 48-bit mark has not been significantly pushed further by quantum means since. In the classical realm, by contrast, factorization records are measured in hundreds of bits: in 2020, researchers classically factored RSA-250 (an 829-bit number, with 250 decimal digits) using months of supercomputer time. The General Number Field Sieve algorithm remains the champion for large classical factoring, and it too scales sub-exponentially (approximately $$L(N) \sim \exp\big((64/9)^{1/3} (\log N)^{1/3}(\log\log N)^{2/3}\big)$$) – but for 2048-bit numbers, that’s still infeasible with any existing computing power. Quantum computers offered a theoretical exponential speedup via Shor’s algorithm, but the catch was always the enormous resource requirement to implement Shor’s algorithm for RSA sizes. Those requirements – millions of high-quality qubits operating with low error rates for billions of operations – remain unchanged despite the flurry of research. No shortcut has yet been found that truly sidesteps the need for a large-scale quantum computer.
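
To put the classical gap in numbers, the sub-exponential estimate above can be evaluated directly. Treating it purely as a relative work measure (the formula omits constant factors, so only ratios are meaningful), RSA-2048 comes out around eleven orders of magnitude harder for GNFS than the RSA-250 record:

```python
import math

def gnfs_work(bits: int) -> float:
    """Heuristic GNFS cost L(N) = exp((64/9)^(1/3) * (ln N)^(1/3) *
    (ln ln N)^(2/3)). Constant factors are omitted, so absolute values
    mean little; ratios between key sizes are the useful output."""
    ln_n = bits * math.log(2)
    return math.exp((64 / 9) ** (1 / 3) * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

ratio = gnfs_work(2048) / gnfs_work(829)
print(f"RSA-2048 is ~10^{math.log10(ratio):.0f} times more GNFS work than RSA-250")
```

Since RSA-250 already took months of supercomputer time, another factor of roughly $$10^{11}$$ explains why no one expects a classical break of RSA-2048.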

The hybrid approaches we’ve discussed were all about trading quantum depth (and error correction) for more classical computation. Did they reduce the quantum resources needed? On paper, yes: they targeted factoring with only hundreds of qubits, all while keeping the quantum algorithm shallow enough to run on today’s devices. In practice, however, the burden shifted to the classical side (lattice reductions, huge numbers of runs, etc.), and the overall scaling still ended up effectively exponential. In terms of practical security, none of these approaches have brought RSA-2048 any closer to collapse – it’s still astronomically hard to factor a 2048-bit number with any known method on any existing hardware. The community’s initial skepticism has been largely validated: the 372-qubit RSA-2048 fantasy has not materialized into a real attack.

That being said, the research from 2022–2025 did chart a path forward in one sense: it demonstrated that we should be vigilant and not complacent. Quantum computing is advancing year by year – IBM has already crossed the 1,000-qubit mark with its Condor chip, and its roadmap points toward error-corrected devices. If and when truly large quantum computers arrive, we now have a much better idea of what hybrid algorithms might be attempted. For example, if someone eventually builds a quantum computer with, say, 1,000 high-quality logical qubits, could they combine Shor’s algorithm with some of these lattice or variational tricks to factor a 2048-bit number? Some theoretical work suggests clever optimizations could cut the qubit count significantly. A 2024 study by Chevignard et al. managed to reduce the qubit requirement for standard Shor’s algorithm by combining it with a classical hash trick – getting it down to roughly $$n/2$$ qubits for an $$n$$-bit RSA number. That implies about 1024 logical qubits (and a similarly large circuit depth) to factor RSA-2048 – still an enormous challenge, but a far cry from millions. For perspective, a well-known 2019 estimate put the cost of breaking RSA-2048 at roughly eight hours on a machine with about 20 million noisy physical qubits. We’re not there yet on either count: a thousand logical qubits might mean a million or more physical qubits once you include error correction, and today’s best quantum processors are still noisy and uncorrected. They’re remarkable feats of engineering, but they cannot run the kinds of deep algorithms needed for factoring large numbers without errors overwhelming the computation.
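
That logical-to-physical gap is easy to quantify under a standard back-of-envelope assumption: a surface code of distance $$d$$ costs on the order of $$2d^2$$ physical qubits per logical qubit (real overheads depend on hardware error rates and routing, so treat these figures as illustrative):

```python
def physical_qubits(logical: int, code_distance: int) -> int:
    """Rough surface-code footprint: ~2*d^2 physical qubits per logical
    qubit at code distance d (a common back-of-envelope figure only)."""
    return logical * 2 * code_distance ** 2

for d in (15, 21, 27):
    print(f"1024 logical qubits at distance {d}: ~{physical_qubits(1024, d):,} physical")
# At distances in the high twenties, a thousand logical qubits already
# implies on the order of 1.5 million physical qubits.
```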

The cryptographic community is not waiting around to find out the hard way. The past few years saw intense efforts to develop post-quantum encryption standards, precisely because experts know that Shor’s algorithm on a scalable quantum computer would obliterate RSA and elliptic-curve cryptography. Even though that computer doesn’t exist yet, the U.S. NSA and NIST, among others, have been urging a proactive transition. The wild ride of the 48-bit factorization claim actually provided a useful real-world test of our preparedness: it created a brief panic in media headlines and presumably in a few boardrooms, asking “Is the quantum apocalypse upon us?” The answer was “not yet” – this time. To put it plainly, if you encrypted a message with an RSA-2048 public key today, no one on Earth knows how to factor it with currently available technology, even if they threw every quantum computer and supercomputer we have at the task. But it underscored the importance of moving to quantum-resistant cryptography before a real breakthrough happens. As one tongue-in-cheek observation went: cryptographers now have to worry about both actual quantum computing progress and occasional “quantum hype” false alarms that can sow confusion. Both have to be handled with clear communication and rigorous science.
