Neven’s Law: The Doubly Exponential Surge of Quantum Computing

Quantum computing is often said to be following a trajectory unlike anything seen before in classical tech. In 2019, Google’s Quantum AI director Hartmut Neven noticed something remarkable: within a matter of months, the computing muscle of Google’s best quantum processors leapt so quickly that classical machines struggled to keep up. This observation gave birth to “Neven’s Law,” a proposed rule of thumb that quantum computing power is advancing at a doubly exponential rate – far outpacing the steady exponential progress of Moore’s Law. In Neven’s words, with double-exponential growth “it looks like nothing is happening, nothing is happening, and then whoops, suddenly you’re in a different world.” Neven’s Law offers a provocative lens on how fast quantum breakthroughs might arrive and what that means for technology and security.

What is Neven’s Law and Where Did It Come From?

Neven’s Law originated from Hartmut Neven’s work at Google’s Quantum Artificial Intelligence lab. After months of rapid improvements to Google’s superconducting quantum chips, Neven noted that their performance relative to classical computers was skyrocketing at a “doubly exponential” pace. The “law” was first mentioned publicly at a Google Quantum Lab symposium in May 2019. Essentially, it’s an extrapolation of an in-house trend: each time the team reduced errors and added a few more qubits, classical computers needed exponentially more resources to keep up. Combine these factors, and quantum machines were gaining on classical ones at an almost unheard-of rate. While Moore’s Law (for classical chips) famously predicts a steady doubling of transistor counts roughly every two years – a simple exponential growth – Neven’s Law suggested something even more dramatic in the quantum realm.

To visualize the difference, consider how exponential vs. doubly exponential growth diverge. In a standard exponential trend, you might see a sequence like 2, 4, 8, 16… each step multiplying by 2. Moore’s Law follows this pattern: computing power doubles periodically, which is impressive in its own right. But a doubly exponential trend grows in powers of powers: 2, 4, 16, 256, 65,536… each jump exploding faster than the last. In practice, Neven was observing that quantum computing power (relative to classical) wasn’t just doubling at regular intervals – it appeared to be squaring itself with each advance. Within the time it takes a classical chip to double its power (~18 months), a quantum processor could quadruple in power. Such rapid acceleration has few precedents in technology; as Neven quipped, progress seemed slow until “suddenly you’re in a different world”. Google’s own hardware provided anecdotal evidence: a task that a laptop could simulate in December 2018 required a powerful desktop by January, and by February 2019 even Google’s servers struggled to keep up. That eye-popping pace led Neven to codify the concept that bears his name.
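The two sequences above can be generated directly. This is a minimal illustrative sketch (the function names are my own, not anything from Google or Neven): one series doubles each generation, the other squares itself each generation.

```python
# Illustrative comparison of Moore's-Law-style exponential growth
# versus Neven's-Law-style doubly exponential growth.

def exponential(generations, base=2):
    """Power doubles each generation: 2, 4, 8, 16, 32, ..."""
    return [base ** n for n in range(1, generations + 1)]

def doubly_exponential(generations, base=2):
    """Power squares each generation: 2, 4, 16, 256, 65,536, ..."""
    return [base ** (2 ** (n - 1)) for n in range(1, generations + 1)]

for gen, (e, d) in enumerate(zip(exponential(5), doubly_exponential(5)), start=1):
    print(f"generation {gen}: exponential = {e:>6,}   doubly exponential = {d:>10,}")
```

By generation five the exponential series has reached only 32, while the doubly exponential one has hit 65,536 – exactly the divergence described above.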

Why “Doubly Exponential” Is a Big Deal (versus Moore’s Law)

Neven’s Law stands out because it implies faster-than-exponential growth. Moore’s Law has been our yardstick for innovation for decades – it’s the reason your smartphone today is millions of times more powerful than room-sized computers of the 1960s. But Moore’s Law is ultimately linear on a log-scale: a steady, predictable curve. Neven’s doubly exponential curve, if it holds, would be more like a rocket launch. In concrete terms, Moore’s Law means each generation of classical processors is about twice as powerful as the last. Neven’s Law means each new generation of quantum processor outpaces the last by a squaring factor. For example, within roughly the same timeframe that classical chips double, a quantum computer might 4× its abilities. Over a few generations, that gap becomes enormous: after four cycles, exponential growth might yield a 16× improvement, whereas doubly exponential growth could yield a 65,536× improvement. It’s easy to see why this got researchers excited – and a little nervous. Doubly exponential growth is so extreme that it’s hard to find real-world analogies. In fact, Quanta Magazine noted that no known technology was definitively following a doubly-exponential curve…until possibly quantum computing.

Why would quantum progress be so explosive? The theory behind Neven’s Law combines two effects. First, quantum computers inherently hold an exponential advantage over classical ones in certain tasks: n qubits can represent $$2^n$$ states, so simulating a quantum system with just 50 or 60 qubits can overwhelm classical supercomputers. Second, hardware improvements were coming fast and furious – Google and others were rapidly increasing qubit counts and lowering error rates (which had been a major limiting factor). Each incremental improvement in qubits or quality made the quantum processor exponentially stronger, and simultaneously made the task of classically simulating that processor exponentially harder. In Neven’s view, these two exponentials multiplied, leading to a double-exponential relative gain.
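The first of those two effects is easy to quantify. Storing the full state vector of n qubits on a classical machine requires $$2^n$$ complex amplitudes; assuming 16 bytes per amplitude (two 64-bit floats, a common convention in simulators), the memory requirement alone explodes well before 60 qubits. A rough back-of-the-envelope sketch:

```python
# Why simulating ~50-60 qubits overwhelms classical machines:
# a full state vector needs 2**n complex amplitudes. We assume
# 16 bytes per amplitude (two 64-bit floats) for illustration.

def state_vector_bytes(n_qubits, bytes_per_amplitude=16):
    """Memory needed to hold a full n-qubit state vector."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (30, 40, 50, 60):
    gib = state_vector_bytes(n) / 2**30
    print(f"{n} qubits -> {2**n:,} amplitudes -> {gib:,.0f} GiB")
```

At 30 qubits the state vector fits in 16 GiB of RAM; at 50 qubits it needs about 16 million GiB (16 PiB), beyond any supercomputer’s memory. Each added qubit doubles the cost, which is the classical-simulation exponential that Neven’s argument multiplies against the hardware-improvement exponential.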

It’s important to note that Neven’s Law is specifically about quantum vs classical performance. This means it’s describing how quickly quantum machines are catching up to, and then surpassing, the capabilities of classical machines at certain tasks. Once quantum computers clearly overtake classical ones for a given problem, that comparison becomes less relevant – at that point we’d compare quantum progress to earlier quantum machines, not to classical benchmarks. (In other words, after “quantum supremacy” is achieved for some task, future improvements would likely look more like a normal exponential progression of quantum technology itself.) But until that threshold is crossed, Neven’s Law dramatically frames the race: it says the finish line might arrive much sooner than a linear or even exponential extrapolation would suggest.

Implications: Quantum Supremacy, Advantage, and the New Timeline

If Neven’s Law holds true, the timeline for quantum breakthroughs compresses significantly. In 2019, Neven himself hinted that the milestone of quantum supremacy – a quantum computer decisively outperforming the fastest classical supercomputer on a specific task – was imminent. Indeed, later that year Google announced it had achieved just that: their 53-qubit Sycamore processor solved a contrived random-circuit sampling problem in about 200 seconds, a task they estimated would take Summit (then the world’s top supercomputer) 10,000 years. In other words, Sycamore did in minutes what a classical machine would need millennia for – roughly a 1.5-billion-fold speedup. This controversial claim (IBM argued the task could be done in days, not millennia, on a classical machine) nevertheless demonstrated that quantum hardware had very suddenly reached an era of supremacy-like capability. It was a vivid example of Neven’s prediction: “quantum supremacy is around the corner” – and then, seemingly overnight, it was here.
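The speedup figure is simple to check from Google’s own numbers (200 seconds quantum vs. an estimated 10,000 classical years):

```python
# Back-of-the-envelope check of the Sycamore speedup claim:
# 200 seconds on the quantum processor vs. an estimated
# 10,000 years on the Summit supercomputer.

SECONDS_PER_YEAR = 365.25 * 24 * 3600   # ~3.156e7 seconds

quantum_seconds = 200
classical_seconds = 10_000 * SECONDS_PER_YEAR

speedup = classical_seconds / quantum_seconds
print(f"Estimated speedup: ~{speedup:.2e}x")
```

This works out to roughly 1.6 × 10⁹, i.e. on the order of 1.5 billion times faster – with the caveat, as noted, that IBM disputed the 10,000-year classical estimate itself.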

Following that 2019 result, the race only intensified. By late 2021, startup QuEra had unveiled a 256-qubit quantum simulator, a fivefold jump in qubit count over Google’s Sycamore in just two years. IBM, meanwhile, broke the 100-qubit barrier with its 127-qubit Eagle processor in 2021, and one year later unveiled a 433-qubit chip (codenamed Osprey). IBM’s engineering roadmap aimed for a 1,121-qubit system (Condor) by 2023, and the company is now eyeing devices with many thousands of qubits through modular linking by the mid-2020s. Other contenders – from IonQ’s trapped ion systems to Xanadu’s photonic processors – also rapidly scaled up the qubit counts or demonstrated new feats. Thanks to these leaps, what researchers call “quantum advantage” – meaning a quantum computer performing a useful, real-world task better than a classical computer can – is no longer a distant dream. In fact, analysts at EY noted in 2022 that if current trends persist, we could start to see “pockets of disruption” from quantum computing in as little as three years.

Beyond that lies the holy grail: fault-tolerant quantum computing – error-corrected, large-scale machines that can tackle any algorithm (like Shor’s algorithm for breaking encryption or complex AI training tasks). Neven’s Law paints an optimistic timeline. If quantum power keeps compounding at a doubly exponential rate, milestones that might have been penciled in for 2040 or beyond could arrive in the early 2030s.

Of course, it must be said that predicting technology “laws” is tricky. Moore’s Law itself eventually ran into physical limits; doubly exponential growth in quantum computing may likewise plateau due to engineering challenges (scaling qubits is hard, maintaining coherence and low error rates gets exponentially harder with size, etc.). Even Neven acknowledged his law as an empirical observation, not a guarantee carved in stone. Still, the mere possibility of a doubly exponential burst in computing power has far-reaching implications. It means sooner rather than later, society could cross key thresholds: solving currently unsolvable problems in materials science, AI, or cryptography. For planning purposes, many experts now treat the advent of powerful quantum computers as a when, not if, and likely “sooner than expected” scenario. We are effectively preparing for the sudden arrival that Hemingway once described in a different context: progress happens gradually, then suddenly. With Neven’s Law, the “suddenly” could catch unprepared industries off guard.

Urgency in Cybersecurity: The Race to Go Post-Quantum

One domain taking Neven’s Law especially seriously is cybersecurity. If quantum computing power is growing at anything close to a doubly exponential rate, the timelines for breaking encryption could dramatically accelerate. The specter of “suddenly, you’re in a different world” looms large over cryptography. Modern public-key encryption (like RSA and ECC) relies on mathematical problems that are practically impossible for classical computers to solve in reasonable time. But a sufficiently powerful quantum computer could crack them – for instance, Shor’s algorithm running on a large quantum computer could factor the large semiprime numbers that underlie RSA, defeating the encryption. The question is when such a quantum machine will exist.

What Neven’s Law injects is a sense of urgency and uncertainty. If quantum capabilities truly explode in the coming decade, timelines once measured in decades could shrink dramatically. A blog from semiconductor firm NXP put it plainly: “Although we only have 100-qubit computers today, Neven’s Law observes that quantum computers are gaining power at a double exponential rate, much more aggressive than Moore’s Law. If this upward trend continues, we may see quantum computations capable of solving real-world cryptographic problems in 10 to 15 years.” Ten to fifteen years from 2022 lands around 2032–2037 – meaning the mid-2030s could potentially see quantum decryption of today’s standard encryption. That estimate might even be pulled in if progress accelerates or if some breakthrough shortcuts the need for millions of physical qubits. In the national security community, this possibility has led to warnings of a looming “Y2Q” or Q-Day – the day when quantum computers can break current crypto, rendering all our secret communications readable.

Because of this, the urgency for post-quantum cryptography (PQC) has reached the highest levels. Governments and companies are essentially in a race against the clock – a clock whose alarm time is unknown, and which Neven’s Law suggests might ring sooner than expected. The U.S. government, for example, has not sat idle. In 2016, the National Institute of Standards and Technology (NIST) launched a worldwide competition to develop quantum-resistant encryption algorithms. That process concluded its first phase in 2022 with NIST selecting new PQC standards (like the CRYSTALS-Kyber algorithm for key exchange) to replace RSA and ECC in the coming years.

The reason for acting early is that cryptographic transitions are slow and complex. The world can’t flip its encryption algorithms overnight; standards must be updated, software and hardware must be upgraded, keys and certificates must be replaced. This could take years to do thoroughly. And every year of delay is a year in which more data could be covertly harvested.

Can We Trust Neven’s Law? (Experts Weigh In)

Given the high stakes, it’s worth scrutinizing how credible Neven’s Law is as a forecasting tool. Among quantum researchers, opinions vary. Some embrace the optimism it represents, while others urge caution. Skeptics point out that technology rarely follows neat mathematical laws indefinitely. University of Maryland’s Andrew Childs, for example, has expressed doubts that the interplay of quantum and classical progress is truly yielding a sustained double exponential trend. He and others stress that classical computing is a moving target – as quantum hardware improves, so do classical algorithms and hardware, potentially delaying the crossover point or flattening the comparative curve. Additionally, quantum research faces looming engineering hurdles: scaling from 100 qubits to 1,000 is one thing; scaling to a million (with fault tolerance) is a vastly harder challenge that might slow the pace significantly. Gil Kalai, a noted quantum computing skeptic, often argues that error rates and noise will fundamentally limit quantum machines from ever achieving the massive scalability required for cracking encryption. If he and like-minded critics are right, quantum growth could stall or reach an asymptote before upending the world of computing.

On the other side, optimists and insiders assert that while we shouldn’t take “doubly exponential” too literally, the momentum is real. Hartmut Neven’s own track record lends some weight – he made the prediction in early 2019, and by year’s end his lab hit a major milestone right on cue. Jonathan Dowling (for whom the “Dowling-Neven Law” is partially named) pointed out that he had charted a similar trend as early as 2013, projecting that if qubit counts doubled every few months, we’d get a “super-exponential” growth in quantum computing power. He saw Neven’s 2019 pronouncement as validation of that pattern. Dowling, until his passing in 2020, was a strong proponent of the idea that quantum computing progress was often under-estimated by outsiders. Many in the field share a sense that there’s a kind of Moore’s Law for qubits quietly in effect. For instance, IBM’s quantum division has publicly stated they aim to continue doubling the quantum volume (a combined measure of qubits and error rate performance) of their processors annually – which, if maintained, certainly qualifies as exponential growth. While this is not “doubly exponential” in the Neven sense, it’s still an impressive trajectory that could yield practical quantum advantage in the near term.

Some experts also note that Neven’s Law might apply in bursts rather than as a smooth curve. The lead-up to quantum supremacy at Google was one such burst – a confluence of improvements that suddenly left classical simulation in the dust. We may see future bursts as new breakthroughs occur (for example, if a new qubit architecture dramatically lowers error rates or if someone figures out a quasi-error-corrected design that scales much faster). During those bursts, progress can appear doubly exponential, only to slow down during the in-between phases. This punctuated advancement is common in nascent technologies.

Crucially, even those who question the degree of the exponent don’t deny the rapid progress in quantum computing. As Scott Aaronson remarked, the burden is now on skeptics to explain “where and why the progress will stop.” So far, no fundamental law of physics has emerged that says scalable quantum computers are impossible; on the contrary, each passing year of progress (in qubits, coherence times, error rates, etc.) erodes the earlier objections. We’ve moved from debating if quantum supremacy could ever be achieved to seeing it done. Now the debate shifts to how far and how fast we can go toward full-scale quantum computing. Neven’s Law, whether perfectly accurate or not, has served as a clarion call – a provocative reminder not to underestimate the quantum trajectory.

Marin Ivezic

I am the Founder of Applied Quantum (AppliedQuantum.com), a research-driven consulting firm empowering organizations to seize quantum opportunities and proactively defend against quantum threats. A former quantum entrepreneur, I’ve previously served as a Fortune Global 500 CISO, CTO, Big 4 partner, and leader at Accenture and IBM. Throughout my career, I’ve specialized in managing emerging tech risks, building and leading innovation labs focused on quantum security, AI security, and cyber-kinetic risks for global corporations, governments, and defense agencies. I regularly share insights on quantum technologies and emerging-tech cybersecurity at PostQuantum.com.