Industry News

Google Announces Willow Quantum Chip

Santa Barbara, CA, USA (Dec 2024) – Google has unveiled a new quantum processor named “Willow”, marking a major milestone in the race toward practical quantum computing. The 105-qubit Willow chip demonstrates two breakthroughs that have long eluded researchers: it dramatically reduces error rates as qubit count scales up, and it completed a computational task in minutes that would take a classical supercomputer longer than the age of the universe. These achievements suggest Google’s quantum hardware is edging closer to the threshold of useful quantum advantage, paving the way for large-scale systems that could outperform classical computers on real-world problems.

Pushing Quantum Performance to New Heights

Google’s Quantum AI team built Willow as the successor to its 2019 Sycamore chip, roughly doubling the qubit count from 53 to 105 while vastly improving qubit quality. Crucially, Willow’s design isn’t just about adding more qubits – it’s about better qubits. In quantum computing, more qubits mean nothing if they’re too error-prone. Willow tackles this with engineering refinements that boost qubit coherence times to ~100 microseconds, about 5× longer than Sycamore’s 20 μs. That stability, combined with an average qubit connectivity of 3.47 in a 2D grid, gives Willow “best-in-class” performance on holistic benchmarks like quantum error correction and random circuit sampling.

In a standard benchmark test known as Random Circuit Sampling (RCS), Willow proved its mettle. It churned through a complex random circuit in under five minutes – an instance so computationally hard that today’s fastest classical supercomputer would need an estimated 10 septillion (10^25) years to do the same. This isn’t just a parlor trick; it’s a strong indicator that Willow has achieved a quantum “beyond-classical” regime. Hartmut Neven, founder of Google Quantum AI, noted that RCS is currently “the classically hardest benchmark” for a quantum processor, essentially a stress test to prove the quantum machine is doing something no normal computer could. The result builds on Google’s 2019 quantum supremacy experiment, but with a chip far more powerful than before – and it hints that useful quantum computing may arrive sooner than skeptics expect.
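
As a rough sanity check on those headline numbers, the implied speedup can be computed directly from the figures quoted above; this is purely back-of-the-envelope arithmetic, not a claim about how the estimate was derived.

```python
# Back-of-the-envelope arithmetic on the RCS figures quoted above:
# ~5 minutes on Willow vs. an estimated 10^25 years ("10 septillion")
# on a classical supercomputer. Illustrative only.
MINUTES_PER_YEAR = 365.25 * 24 * 60  # ~525,960 minutes per year

classical_minutes = 1e25 * MINUTES_PER_YEAR
willow_minutes = 5.0
speedup = classical_minutes / willow_minutes

print(f"Implied speedup factor: ~{speedup:.1e}")  # roughly 1e30
```

In other words, the quoted estimate puts Willow about thirty orders of magnitude ahead of classical brute force on this specific benchmark.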

Perhaps Willow’s most significant feat is in quantum error correction – the decades-long quest to tame quantum errors. In tests, Google showed that by grouping physical qubits into a logical qubit “surface” and gradually enlarging that group, the error rate dropped instead of rising. Scaling the encoded patch from a 3×3 to a 5×5 to a 7×7 grid of qubits roughly halved the logical error rate at each step. “This historic accomplishment is known in the field as below threshold – being able to drive errors down while scaling up the number of qubits,” Neven explained, calling it an “unfakeable sign” that error correction is materially improving the system. In practical terms, Willow is the first quantum chip to demonstrate error rates that improve exponentially as qubits are added, a key proof-of-concept for building much larger fault-tolerant quantum computers. Google reports it even ran real-time error correction cycles on the chip during calculations, a notable first for superconducting qubits.
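
The “errors halve as the patch grows” behavior can be sketched with a toy model of surface-code scaling. The suppression factor Λ ≈ 2 matches the halving described above, but the distance-3 base error rate used here is an illustrative placeholder, not Google’s measured value.

```python
# Toy model of "below threshold" surface-code scaling: growing the
# code distance d (3 -> 5 -> 7, i.e. a 3x3 -> 5x5 -> 7x7 patch)
# suppresses the logical error rate by a factor Lambda per step.
# Willow's result corresponds to Lambda ~ 2 (errors roughly halve).
def logical_error_rate(eps_d3: float, distance: int, lam: float = 2.0) -> float:
    """Extrapolate the logical error rate from the distance-3 rate:
    eps(d) = eps(3) / lam ** ((d - 3) / 2)."""
    steps = (distance - 3) // 2
    return eps_d3 / lam ** steps

eps3 = 3.0e-3  # hypothetical distance-3 logical error rate per cycle
for d in (3, 5, 7):
    print(f"distance {d} ({d * d} data qubits): "
          f"~{logical_error_rate(eps3, d):.2e} errors per cycle")
```

“Below threshold” means Λ > 1: the bigger the patch, the lower the logical error rate, which is exactly the trend the 3×3 → 7×7 experiment demonstrated.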

For more in-depth information about Google’s error-correction achievement, see the Nature paper accompanying the announcement: Quantum error correction below the surface code threshold.

Under the Hood of Willow’s Design

The Willow processor is built on Google’s preferred platform: superconducting transmon qubits arranged in a square lattice. Each qubit is a tiny circuit on a chip cooled to millikelvin temperatures. Willow was fabricated end-to-end in Google’s new custom quantum chip facility in Santa Barbara. This tight vertical integration – from materials to fabrication to cryogenics – was key to its success. Anthony Megrant, Google Quantum AI’s chief architect, noted that the company moved from a shared university fab into its own cleanroom to produce Willow, which speeds up the iteration cycle for new designs. All quantum operations (single-qubit gates, two-qubit gates, state reset, and readout) were co-optimized in Willow’s design, ensuring no one component lags behind. The result is a balanced system where every part works in harmony – critical, because any weak link would drag down the overall fidelity.

To appreciate Willow’s performance, consider its coherence and gate quality metrics. Google reports qubit T1 coherence times (how long a qubit can retain its state) approaching 100 µs. Gate fidelities are not explicitly stated in the announcement, but the successful error-correction experiment implies extremely high fidelity two-qubit gates and measurement reliability. In fact, Willow’s holistic benchmark results now rival or exceed other platforms. For example, in random circuit sampling, Willow outpaces one of the world’s most powerful classical supercomputers by an overwhelming margin. A performance table released by Google shows Willow leading in key specs among contemporary quantum chips, underscoring that the focus on “quality, not just quantity” of qubits has paid off.
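
To get a feel for what a 5× longer T1 buys, consider an idealized decay model: an excited qubit state survives time t with probability exp(−t/T1). The 10 µs circuit duration below is an assumed illustrative figure, not a number from Google’s announcement.

```python
import math

# Idealized coherence comparison: the probability that a qubit's
# excited state survives time t decays as exp(-t / T1). Comparing
# Sycamore-era (~20 us) and Willow (~100 us) coherence times over
# a 10 us stretch of circuit time (assumed, for illustration).
def survival_probability(t_us: float, t1_us: float) -> float:
    return math.exp(-t_us / t1_us)

t = 10.0  # microseconds of circuit time (assumed)
for name, t1 in [("Sycamore, T1 ~ 20 us", 20.0), ("Willow, T1 ~ 100 us", 100.0)]:
    print(f"{name}: survival after {t} us = {survival_probability(t, t1):.3f}")
```

Under this simple model the survival probability rises from roughly 0.61 to roughly 0.90 over the same window – a large margin when errors compound across thousands of gates.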

This chip is still a far cry from a general-purpose quantum computer, but it bridges an important gap. So far, quantum demos have fallen into two buckets: contrived mathematical challenges beyond classical reach (like RCS), or useful simulations that could still be done with classical supercomputers given enough time. Willow aims to do both at once – reach beyond-classical computational power and tackle problems with real-world relevance. “The next challenge for the field is to demonstrate a first ‘useful, beyond-classical’ computation on today’s quantum chips that is relevant to a real-world application,” Neven wrote, expressing optimism that the Willow generation can hit that goal.

Google vs. IBM, and the Quantum Competition

The quantum computing race has several heavyweights, and Google’s announcement comes on the heels of notable advances by others. IBM, for instance, recently introduced “Condor,” the world’s first quantum processor to break the 1,000-qubit barrier with 1,121 superconducting qubits. IBM’s approach has emphasized scaling up qubit counts and linking smaller chips into larger ensembles. Its roadmap envisions modular systems and fault-tolerant quantum computing by around 2030. In fact, IBM has publicly targeted having useful error-corrected qubits by the end of this decade, enabled by iterative improvements in qubit design (their next-gen chips called Flamingo, etc.). IBM’s current hardware (433-qubit “Osprey” and 127-qubit “Eagle”, among others) still operates in the noisy, error-prone regime, but IBM has shown steady progress in error mitigation techniques and complexity of circuits. Earlier this year, IBM demonstrated it could run quantum circuits of 100+ qubits and 3,000 gates that defy brute-force classical simulation – a similar “beyond classical” milestone, albeit without full error correction. Condor, with its record qubit count, has performance comparable to IBM’s earlier 433-qubit device, indicating that simply adding qubits isn’t enough without boosting fidelity. This is where Google’s Willow differs: Google chose to keep qubit count modest while achieving an exponential reduction in errors through better engineering. It’s a quality-vs-quantity trade-off playing out in real time.

Beyond the big players, a number of startups and research labs are also in the fray. IonQ, for example, uses trapped-ion qubits and recently reported a record 36 “algorithmic qubits” (AQ) on its latest system, meaning it can effectively utilize 36 high-fidelity qubits for real-world algorithms. While 36 useful qubits is far below Google’s raw qubit count, IonQ emphasizes that each increment in their AQ metric doubles the computational space, with 36 AQ representing the ability to consider over 68 billion states simultaneously. IonQ’s systems have the advantage of all-to-all connectivity between qubits and room-temperature operation, but they are slower than superconducting qubits. Other contenders like Rigetti and academic efforts are also exploring hybrid approaches and improved error mitigation. In short, the quantum computing landscape is vibrant: Google’s Willow ups the ante on error correction; IBM is pushing scale; Microsoft is rethinking qubit physics; and others are finding niches in between. Each approach—superconducting, topological, trapped-ion, photonic, etc.—comes with trade-offs, but all share the common goal of reaching a threshold where useful quantum computation outperforms classical for important tasks.
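
The “68 billion states” figure follows directly from how qubit state spaces grow: n qubits span 2^n basis states, so each additional algorithmic qubit doubles the space.

```python
# Each additional (algorithmic) qubit doubles the computational state
# space: n qubits span 2**n basis states. IonQ's 36 AQ corresponds to:
n_aq = 36
n_states = 2 ** n_aq
print(f"{n_aq} algorithmic qubits -> {n_states:,} basis states")
# -> 36 algorithmic qubits -> 68,719,476,736 basis states
```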

Scaling Up: Challenges on the Road to Quantum Utility

Even with the excitement around Willow, formidable challenges remain before quantum computers become a mainstream tool. Google’s latest accomplishment, while impressive, was essentially a one-off demonstration on a specialized benchmark. Researchers caution that general-purpose, fault-tolerant quantum computers will require orders of magnitude more qubits and further breakthroughs in error correction. As a sober reminder, Google’s own team estimates that breaking modern cryptography (like 2048-bit RSA encryption) with a quantum computer is at least 10 years away. Hartmut Neven told BBC News that he doesn’t expect a commercial quantum chip to be available before the end of the decade. In other words, 2020s quantum chips are still experimental prototypes, not ready to run your everyday computing tasks. Willow’s error-correction win involved a single, relatively small logical qubit (a distance-7 surface code spanning a 7×7 patch of 49 data qubits); achieving error-corrected multi-qubit operations and scaling to thousands or millions of physical qubits for complex algorithms is a whole new mountain to climb.

One big challenge is scaling without introducing new errors. As more qubits and components are added, maintaining ultra-low noise in a cryogenic environment becomes exponentially harder. The Willow chip’s 105 qubits already demand extremely precise control systems and cryostat cooling to around 10 millikelvins. Future devices may need integrated cryo-control electronics, more microwave lines, and perhaps modular architectures to keep things manageable. IBM, for instance, is planning to connect smaller chips (like its 133-qubit “Heron” processors) into larger ensembles to scale up while isolating error zones. Google may pursue a similar modular strategy or refine its surface code further so that each logical qubit is built from, say, 100 physical qubits instead of 1,000+. Manufacturing yield is another issue – building hundreds or thousands of high-quality qubits on a chip without defects will tax even cutting-edge fabrication processes. Google did invest in a dedicated fab for this reason, to iterate faster and learn how to manufacture at scale.

There’s also the matter of software and algorithms. To truly commercialize quantum computing, it’s not enough to have the hardware; one needs algorithms that solve valuable problems faster or better than classical methods. So far, many quantum algorithms (for chemistry, optimization, machine learning, etc.) have been tested only on small scales or in simulation. Google and others are actively developing new quantum algorithms and collaborating with industry and academia to find “useful, beyond-classical” applications. But until a quantum computer can solve a practical problem (say, simulate a complex molecule or optimize a large logistic network) better than a traditional supercomputer, it will remain a niche technology. The timeline for “quantum advantage” in practical tasks is still uncertain. Google’s team is optimistic, especially after Willow – they think it’s plausible in the next few years with further refinements. In fact, bolstered by Willow’s success, Google’s leadership suggested that commercially relevant quantum computing might be only five years away, significantly sooner than previous forecasts. IBM is a bit more conservative, eyeing the early 2030s for fault-tolerant systems, and some skeptics (like Nvidia’s CEO) had thought it could be 20 years off. The consensus is shifting toward sooner-than-expected, but it will require solving engineering problems at every level, from qubit materials up to software.

Finally, cost and infrastructure pose non-trivial barriers. Quantum hardware must operate at extreme conditions – Willow’s qubits reside in a custom-built dilution refrigerator (a cryostat) that towers overhead with coils of wiring and multiple cooling stages. These machines are expensive to build and maintain. As Semafor quipped, the next big challenge after achieving this quantum breakthrough is bringing the costs down for wider deployment. Companies like IBM and Google are exploring quantum cloud services, where users can access quantum processors remotely. But for quantum computing to truly proliferate, the devices will need to become more robust, automated, and cheaper – perhaps using error-corrected qubits to relax some hardware burdens, or new technologies to eliminate the need for massive cryogenics. All of these challenges – scaling, error correction, software, and cost – will require sustained innovation over the coming years. As Neven put it, “we invite researchers, engineers, and developers to join us on this journey” of solving these problems. In the meantime, each incremental breakthrough, like Willow, is a crucial stepping stone.

Why Willow Matters: Toward Quantum-Powered Industry and Science

The rapid advances in chips like Willow are not just an academic exercise – they carry profound implications for technology, industry, and science. A functioning, large-scale quantum computer promises to unlock solutions to problems that are effectively intractable for classical computers. Drug discovery and molecular simulation are oft-cited examples: quantum machines could simulate the quantum behavior of complex molecules and materials with far more accuracy, helping chemists design new medicines or high-temperature superconductors. Materials science and battery design stand to benefit similarly, as quantum computers could model atomic interactions in new materials or electrolytes that classical methods struggle with. Optimization and logistics form another domain: companies face notoriously hard problems in optimizing supply chains, scheduling, and routing (think airline schedules or delivery routes). Certain optimization algorithms might see substantial speedups on quantum hardware, enabling more efficient operations in industries like manufacturing, transportation, or energy grid management. Financial services could use quantum algorithms for risk modeling or portfolio optimization given enough stable qubits.

Google explicitly hopes quantum computers will help tackle “unsolvable” problems in medicine, energy, and AI, among others. In fact, Sundar Pichai pointed to fusion energy and climate change as grand challenges that quantum computing might help address in the future. Machine learning is another exciting frontier – hybrid quantum-classical algorithms might one day boost AI capabilities or enable entirely new forms of data analysis. Hartmut Neven has drawn a connection between quantum computing and AI, noting that advanced AI systems could themselves benefit from quantum computational power. For example, quantum optimization could improve machine learning training, or quantum sampling might help AI models become more robust. This synergy is one reason Google’s lab is literally called “Quantum AI.”

That said, it’s important to temper expectations: practical applications will roll out gradually. We may first see quantum computers used in cloud services for specialized tasks – for instance, a pharmaceutical company might use Google’s or IBM’s quantum cloud to evaluate a chemical reaction mechanism that’s beyond classical simulation. Over time, as machines get more powerful, the range of applications will widen. Governments and enterprises are already investing in quantum software development, anticipating a future where this technology becomes a strategic asset. Notably, the progress of Willow and its peers also has a security dimension: large quantum computers could eventually break certain cryptographic codes, which is why efforts in post-quantum cryptography are racing to stay ahead. Google’s team downplays any immediate threat – as mentioned, they estimate at least a decade before quantum code-breaking is feasible – but the clock is ticking for upgrading encryption standards across the internet.

In the grand scheme, the “Willow” announcement is a bellwether for the quantum computing field. It signals that the era of quantum advantage is drawing closer, moving from laboratory curiosity to a tool that businesses and scientists can envision using. As one analyst remarked in light of Google’s news, Willow’s exponential error reduction “could lead to major breakthroughs and discoveries across industries” once scaled up. The achievement also adds fuel to the competitive fire among tech giants. Alphabet’s stock got a modest boost after the reveal, reflecting investor recognition that quantum computing could be a key technology race of the next decade. With IBM pushing toward a quantum-centric supercomputing future and Microsoft betting on a topologically protected leapfrog, Google’s Willow puts it firmly among the frontrunners.

In summary, Google’s Willow chip represents a significant “quantum leap” in computing – not just in raw power, but in showing that the vexing problem of quantum error correction can be solved in practice. There are many milestones yet to reach, but the path to a useful, large-scale quantum computer looks clearer than ever. If progress continues at this clip, the coming years could see quantum machines transition from research labs to real-world deployments, tackling problems once deemed unattainable. For a tech-savvy world facing ever more complex challenges, that is a development worth watching closely. The quantum revolution that once seemed perpetually distant is, with Willow’s debut, feeling tantalizingly within reach.

Marin Ivezic

I am the Founder of Applied Quantum (AppliedQuantum.com), a research-driven professional services firm dedicated to helping organizations unlock the transformative power of quantum technologies. Alongside leading its specialized service, Secure Quantum (SecureQuantum.com)—focused on quantum resilience and post-quantum cryptography—I also invest in cutting-edge quantum ventures through Quantum.Partners. Currently, I’m completing a PhD in Quantum Computing and authoring an upcoming book “Practical Quantum Resistance” (QuantumResistance.com) while regularly sharing news and insights on quantum computing and quantum security at PostQuantum.com. I’m primarily a cybersecurity and tech risk expert with more than three decades of experience, particularly in critical infrastructure cyber protection. That focus drew me into quantum computing in the early 2000s, and I’ve been captivated by its opportunities and risks ever since. So my experience in quantum tech stretches back decades, having previously founded Boston Photonics and PQ Defense where I engaged in quantum-related R&D well before the field’s mainstream emergence. Today, with quantum computing finally on the horizon, I’ve returned to a 100% focus on quantum technology and its associated risks—drawing on my quantum and AI background, decades of cybersecurity expertise, and experience overseeing major technology transformations—all to help organizations and nations safeguard themselves against quantum threats and capitalize on quantum-driven opportunities.