Industry News

IBM Osprey: A 433-Qubit Quantum Leap

Yorktown Heights, N.Y., USA (Nov 2022): IBM has announced Osprey, a superconducting quantum processor with a record-breaking 433 qubits – by far the largest of its kind as of its 2022 debut. Revealed at the IBM Quantum Summit in November 2022, Osprey more than triples the qubit count of IBM’s previous 127-qubit Eagle chip. IBM says this new processor “brings us a step closer to the point where quantum computers will be used to tackle previously unsolvable problems,” according to Dr. Darío Gil, IBM’s Director of Research. In principle, a state on the 433-qubit Osprey has an information content so enormous that the number of classical bits required to represent it “far exceeds” the total number of atoms in the known universe. While practical quantum applications remain nascent, the Osprey chip’s sheer scale marks a major milestone in the quest to transcend classical computing limits.
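To put that claim in perspective: an n-qubit state is described by 2ⁿ complex amplitudes, so Osprey’s state space involves 2⁴³³ of them. A minimal back-of-the-envelope check (assuming the commonly cited rough figure of ~10⁸⁰ atoms in the observable universe):

```python
# Back-of-the-envelope: number of complex amplitudes describing a
# 433-qubit state vs. the ~10^80 atoms in the observable universe
# (that atom count is a rough, commonly cited estimate).
from math import log10

n_qubits = 433
log10_amplitudes = n_qubits * log10(2)   # log10 of 2^433

print(f"2^{n_qubits} ~ 10^{log10_amplitudes:.0f} amplitudes")
print(f"Excess over atom count: ~10^{log10_amplitudes - 80:.0f}x")
# Output: 2^433 ~ 10^130 amplitudes, roughly 10^50 times the atom estimate.
```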

Largest Superconducting Processor to Date – and Why It Matters

Osprey’s 433 qubits vault IBM well ahead of prior superconducting quantum efforts in raw qubit count. Its predecessor Eagle (127 qubits) had broken the 100-qubit barrier only a year earlier, in 2021. Competing devices like Google’s 53-qubit Sycamore (which achieved the first quantum “supremacy” demonstration in 2019) and China’s 66-qubit Zuchongzhi processors now look modest by comparison in size. Scale is not everything, but it is a critical ingredient: more qubits allow more complex computations and larger entangled states. In fact, in 2021 a Chinese team led by Jian-Wei Pan used a 56-qubit subset of their 66-qubit Zuchongzhi 2.0 processor to perform a random circuit sampling task beyond what Google’s 53-qubit Sycamore could do. The IBM Osprey, with roughly 3.4 times Eagle’s qubit count, pushes hardware scale into new territory. IBM estimates that Osprey could run quantum circuits of a complexity no classical supercomputer can realistically simulate. This gap between quantum state space and classical simulation grows exponentially with qubit count, underscoring why each jump in qubits is celebrated. Osprey’s debut thus signals that IBM is aggressively scaling up quantum hardware on the path toward quantum advantage – the point at which quantum computers solve useful problems impractical for classical machines.
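To make that exponential gap concrete, here is a minimal sketch of the memory an exact statevector simulation would need, assuming 16 bytes (one double-precision complex number) per amplitude:

```python
# Memory needed to hold a full statevector: 2^n amplitudes at 16 bytes
# each (complex128). Every added qubit doubles the requirement.
for n in (27, 53, 66, 127, 433):
    print(f"{n:>3} qubits: {(2 ** n) * 16:.3e} bytes")
# 27 qubits -> ~2 GB (fits in a laptop); 53 qubits -> ~144 PB (beyond
# any single machine); 127 and 433 qubits -> astronomically out of reach.
```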

Engineering Breakthroughs Under the Hood

Achieving 433 working qubits on a single chip demanded significant hardware innovation. Osprey builds on the 3D multi-layer architecture IBM introduced with Eagle. In earlier IBM designs (like the 27-qubit Falcon and 65-qubit Hummingbird), qubits and wiring were confined to just one or two layers, which limited qubit density. Eagle was IBM’s first to use a four-layer stack: a qubit plane, a separate readout resonator plane, a wiring plane, and an interposer layer for signal delivery. This allowed control wiring to be routed through multiple chip layers, freeing up space to add many more qubits without sacrificing performance. Osprey adopts a similar architecture but at a much larger scale, nearly quadrupling the chip size. According to IBM, the Osprey family required further enhancements to device packaging and new high-density signal cabling to handle the greater I/O demands within the same cryogenic refrigerator. In essence, IBM had to figure out how to pack more than three times as many qubits onto a chip and get signals in and out of them efficiently – a task that called for creative tweaks in design and fabrication.

One key innovation is the use of flexible ribbon cables for signal delivery inside the cryostat, replacing many of the bulky hand-crafted coaxial lines. These flex cables, developed to be cryogenically compatible, can be layered and “stacked in a staircase fashion” to connect the various temperature stages and the chip with much higher wiring density. IBM’s engineers report that this flex wiring increased the available signal lines by about 70% within the same physical volume, while also reducing cost per line. In practice, that means Osprey’s fridge can support the hundreds of microwave control and readout lines needed for 433 qubits without a complete redesign of the infrastructure. Dr. Jerry Chow, IBM’s director of quantum hardware system development, explained that scaling to Osprey’s 3× larger qubit count required “further developing and scaling the multi-level wiring common in Eagle, and optimizing it to pack more qubits together and route them.” The multi-layer chip wiring and new cryogenic flex cabling work in tandem to address the I/O bottleneck that generally makes controlling hundreds of qubits extraordinarily challenging.

Maintaining Qubit Quality at Scale

Simply adding qubits is not enough – their fidelity and coherence must be high for the processor to be useful. Here, IBM faced a balancing act. Superconducting qubits (transmons) must operate at millikelvin temperatures in a noise-isolated environment, and as more qubits and components are added, issues like crosstalk, signal attenuation, and heat load can worsen. IBM says Osprey incorporates integrated on-chip filtering to reduce noise and improve stability, in addition to the layered design for signal routing. Even so, early performance data hinted that Osprey’s first revision (r1) had somewhat shorter coherence times than the best smaller chips. The median relaxation (decay) time T₁ for Osprey qubits is around 70–80 microseconds in this initial version. For comparison, IBM’s 127-qubit Eagle r1 had T₁ ~100 µs, and subsequent refinements pushed Eagle’s T₁ up to ~300 µs in a later revision. Google’s Sycamore qubits, by contrast, had T₁ ≈ 25 µs. So Osprey r1’s coherence is on par with or slightly better than many earlier devices, and IBM is already working on an Osprey r2 with improved materials and calibration aimed at a similar roughly threefold boost in coherence. “The second revision of Osprey, which is already being put together, shows a similar improvement in coherence times,” says Chow. Achieving long-lived qubits on such a large chip is an ongoing challenge – incremental progress here will directly increase the circuit depth Osprey can run before errors dominate.
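The link between T₁ and usable circuit depth is direct: the probability that a qubit has not relaxed after time t falls off roughly as exp(−t/T₁), so longer T₁ buys more gate layers. A minimal sketch under an assumed two-qubit gate time of 300 ns (a representative figure for this class of hardware, not a published Osprey spec):

```python
# Rough depth budget from T1 relaxation alone: per-qubit survival after
# d gate layers is ~exp(-d * t_gate / T1); solve for ~50% survival.
from math import log

T_GATE_NS = 300  # assumed two-qubit gate duration (illustrative)

for label, t1_us in [("Sycamore-class", 25),
                     ("Osprey r1 median", 75),
                     ("Eagle later-revision", 300)]:
    depth_50 = log(2) * t1_us * 1000 / T_GATE_NS   # ln(2) * T1 / t_gate
    print(f"{label:22s} T1 = {t1_us:>3} us -> ~{depth_50:.0f} layers "
          "before 50% relaxation odds")
```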

Crucially, error rates must be kept in check as qubit count grows. IBM had not released full error benchmarking for Osprey at the time of launch. However, the company’s experience with Eagle suggests initial two-qubit gate error rates on a new large chip can be on the order of several percent, which limits the “quantum volume” (a holistic performance metric) until improvements are made. Indeed, an early Eagle (127-qubit) system showed ~8% two-qubit error in 2021, but a later “Eagle v3” achieved around 2% two-qubit error with ~160 µs coherence – a dramatic improvement approaching the fidelity of IBM’s smaller 27-qubit devices. IBM’s strategy has been to roll out new processors to demonstrate scaling, then iterate on their design and calibration to raise fidelity. We can expect Osprey to undergo a similar tuning process. In fact, IBM’s Quantum Roadmap anticipates a near-term focus on improving error rates even if it means using fewer qubits. This shows IBM’s two-pronged approach – push qubit count higher (Osprey, and a planned 1,121-qubit Condor chip) while also engineering better qubits – ultimately to marry both scale and quality in future systems.
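Those percentages compound quickly. Under a crude model (ignoring readout error, crosstalk, and mitigation), a circuit containing G two-qubit gates succeeds with probability roughly (1 − ε)ᴳ, so the usable gate count scales like 1/ε:

```python
# Rough circuit-success estimate: fidelity ~ (1 - eps)^G for G two-qubit
# gates, ignoring readout error, crosstalk, and mitigation.
# eps values are illustrative: ~8% (early Eagle), ~2% (Eagle v3),
# ~0.5% (a hypothetical next-generation target).
for eps in (0.08, 0.02, 0.005):
    max_gates = int(1 / eps)      # since (1 - eps)^(1/eps) ~ 1/e (~37%)
    print(f"eps = {eps:5.3f}: ~{max_gates:4d} two-qubit gates "
          "before fidelity falls below ~37%")
# Going from 8% to 2% error extends usable circuits from ~12 to ~50 gates,
# which is why IBM iterates on fidelity as aggressively as on qubit count.
```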

Comparisons with Eagle, Sycamore, and Zuchongzhi

In terms of architecture, Osprey isn’t a radical departure – it’s an evolutionary scale-up of IBM’s superconducting transmon platform. It retains the heavy-hexagonal qubit layout introduced in prior IBM chips (a hexagonal lattice connectivity that limits each qubit’s neighbors to reduce frequency collisions). Osprey’s qubits are likely coupled in a similar pattern to Eagle’s, just over a larger area. Jay Gambetta, IBM Quantum’s VP, noted that Osprey “uses many of the same technologies and designs” as Eagle – “like a hexagon lattice structure on the chip surface that holds all the qubits.” By contrast, Google’s Sycamore employed a rectangular grid of 53 qubits with tunable couplers and achieved a milestone in 2019 by randomly sampling quantum circuits that were infeasible to simulate classically. That feat sparked the “quantum supremacy” (or quantum advantage) race. IBM famously contested Google’s claim at the time by showing improved classical algorithms could simulate Sycamore’s task faster than initially estimated (in days, not millennia). Still, Sycamore demonstrated that a modest number of high-fidelity qubits could outperform a supercomputer on a contrived task.
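For the curious, the heavy-hex layout can be inspected directly: it places an extra qubit on each edge of a hexagonal lattice, capping every qubit at three neighbors, versus four on a Sycamore-style square grid. A minimal sketch using rustworkx, the graph library underlying Qiskit (assuming its heavy_hex_graph generator; the distance-5 patch size and 7×7 grid are arbitrary illustrative choices):

```python
# Compare qubit connectivity: IBM's heavy-hex lattice caps each qubit
# at 3 neighbors, while a square grid (Sycamore-style) allows 4.
import rustworkx as rx

heavy_hex = rx.generators.heavy_hex_graph(5, multigraph=False)  # distance-5
square = rx.generators.grid_graph(7, 7, multigraph=False)       # 7x7 grid

for name, g in [("heavy-hex", heavy_hex), ("square grid", square)]:
    degrees = [g.degree(node) for node in g.node_indexes()]
    print(f"{name:11s}: {g.num_nodes():3d} qubits, max degree {max(degrees)}")
# heavy-hex   -> max degree 3: sparser coupling, fewer frequency collisions
# square grid -> max degree 4: denser coupling, more crosstalk pathways
```

Fixed-frequency transmons must avoid resonant collisions with their neighbors, so fewer neighbors per qubit means fewer frequency constraints to satisfy across a 433-qubit chip.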

The USTC group in China responded by pushing the envelope further. In 2020 they demonstrated quantum advantage with a photonic system (Jiuzhang), and in 2021 with a superconducting chip called Zuchongzhi 2.0. The Zuchongzhi 2.0 processor had 66 qubits arranged in a 2D array, with 110 tunable couplers to mediate flexible interactions. Pan’s team ran random circuits on 56 of those qubits at 20 cycles deep and showed this was about 10⁶ times harder to simulate than Google’s original 53-qubit experiment.
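For readers who want to see the shape of the benchmark, random circuit sampling alternates layers of random single-qubit rotations with entangling gates, then samples output bitstrings whose distribution is exponentially hard to reproduce classically. A toy pure-NumPy statevector sketch (tiny qubit count; the gate set is simplified relative to the real experiments, which used tuned two-qubit gates on 2D grids):

```python
# Toy random circuit sampling: layers of random single-qubit unitaries
# plus CZ entanglers, then bitstring sampling (NumPy, small n only --
# simulation cost grows as 2^n, which is the point of the benchmark).
import numpy as np

rng = np.random.default_rng(0)
n, cycles = 6, 20              # toy scale; the real runs used 53-56 qubits

def apply_1q(state, gate, q):
    """Apply a 2x2 unitary to qubit q of an n-qubit statevector."""
    psi = state.reshape((2,) * n)
    psi = np.tensordot(gate, psi, axes=([1], [q]))
    return np.moveaxis(psi, 0, q).reshape(-1)

def apply_cz(state, q):
    """Apply CZ to qubits (q, q+1): phase -1 on the |11> component."""
    psi = np.moveaxis(state.reshape((2,) * n), [q, q + 1], [0, 1])
    psi = psi.reshape(4, -1) * np.array([1, 1, 1, -1])[:, None]
    psi = psi.reshape((2, 2) + (2,) * (n - 2))
    return np.moveaxis(psi, [0, 1], [q, q + 1]).reshape(-1)

state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0
for _ in range(cycles):
    for q in range(n):         # roughly Haar-random single-qubit unitary
        g = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
        u, _ = np.linalg.qr(g)
        state = apply_1q(state, u, q)
    for q in range(0, n - 1, 2):
        state = apply_cz(state, q)

probs = np.abs(state) ** 2
samples = rng.choice(2 ** n, size=5, p=probs / probs.sum())
print([format(s, f"0{n}b") for s in samples])  # five sampled bitstrings
```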

Implications for the Industry and What Comes Next

The debut of Osprey highlights the rapid progress in superconducting quantum technology and intensifies the global R&D race. For IBM’s quantum roadmap, Osprey was the 2022 goalpost, to be followed by the 1,121-qubit Condor chip (planned for 2023). Beyond that, IBM aims to reach tens of thousands of qubits by the end of the decade through multi-chip scaling – for example, a 4,158-qubit system of three linked “Kookaburra” chips targeted for around 2025. However, IBM and others recognize that simply scaling up qubits without reducing errors won’t deliver useful quantum advantage. The industry trend is toward strategies like quantum error mitigation and interim error-corrected demonstrations. IBM has been developing software techniques (e.g. circuit knitting and zero-noise extrapolation) to wring more effective performance out of noisy processors; a sketch of the latter appears below. In parallel, as seen with the Heron chip, IBM is exploring architectural changes (like tunable couplers and faster control electronics) to boost fidelity. The long-term vision is to reach a point where quantum computers can tackle problems in fields like chemistry, optimization, and materials science that classical computers cannot handle – problems often described as “unsolvable” today. Osprey brings that vision a small but significant step closer by proving that a few hundred qubits can be built and operated together. This opens the door for researchers to experiment with larger quantum circuits and new error suppression techniques on a scale that was previously inaccessible.
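Zero-noise extrapolation is the easiest of those techniques to illustrate: run the same circuit at deliberately amplified noise levels (e.g., by folding gates), measure the observable at each level, and extrapolate back to the zero-noise limit. A minimal synthetic sketch (the exponential damping model and its constants are illustrative assumptions, not Osprey measurements):

```python
# Zero-noise extrapolation (ZNE): measure an observable at amplified
# noise scales, fit the decay, and extrapolate to the zero-noise limit.
import numpy as np

rng = np.random.default_rng(1)
TRUE_VALUE = 0.85    # ideal expectation value (synthetic ground truth)
DECAY = 0.30         # illustrative per-unit-noise damping constant

def noisy_expectation(scale, shots=4000):
    """Simulated measurement: exponentially damped mean plus shot noise."""
    mean = TRUE_VALUE * np.exp(-DECAY * scale)
    return mean + rng.normal(0.0, 1.0 / np.sqrt(shots))

scales = np.array([1.0, 2.0, 3.0])          # gate-folding noise multipliers
values = np.array([noisy_expectation(s) for s in scales])

# Exponential ansatz: fit log(E) vs. scale, evaluate the fit at scale = 0.
slope, intercept = np.polyfit(scales, np.log(values), 1)
print(f"raw (scale=1): {values[0]:.3f}")
print(f"ZNE estimate:  {np.exp(intercept):.3f}  (ideal: {TRUE_VALUE})")
```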

From an enterprise and cybersecurity perspective, Osprey’s arrival is another reminder that quantum computing is swiftly advancing, even if it hasn’t yet cracked commercially valuable problems. Companies like Bosch and Vodafone joined the IBM Quantum Network around this time, signaling interest in being “quantum-ready.” In fact, IBM announced a partnership with Vodafone to explore quantum-safe cryptography, reflecting the anticipation that future quantum computers will eventually threaten classical encryption. It’s important to note that Osprey’s 433 noisy qubits pose no immediate danger to RSA or other cryptosystems – experts estimate it would take on the order of millions of high-fidelity qubits to crack RSA-2048 via Shor’s algorithm. IBM’s own researchers emphasize that quantum-safe measures adopted now are about “protecting against a very distant threat.” Nonetheless, the march of progress exemplified by chips like Osprey keeps that future threat on the radar, and it spurs investment in post-quantum encryption and hybrid quantum-classical cloud offerings.

In summary, IBM’s 433-qubit Osprey processor represents a significant technical achievement in superconducting quantum computing, marking the largest qubit count on a single chip at the time of its unveiling. It builds on proven design principles (multi-layer wiring, fixed-frequency transmons in a heavy-hex lattice) and extends them to a new scale with innovative engineering solutions for wiring and packaging. Osprey did not immediately deliver a breakthrough computation, but it provides a platform to explore quantum computations of unprecedented size and complexity in the NISQ (noisy intermediate-scale quantum) regime. The lessons learned from Osprey are already influencing next-generation devices that aim to combine high qubit counts with low error rates. While much work remains to achieve practical quantum advantage – including improving coherence, fidelity, and eventually implementing error correction – the Osprey chip is a crucial step forward. It demonstrates that the scaling curve for superconducting qubits is continuing upward, keeping IBM and the broader field on track toward the lofty goal of fault-tolerant quantum computing. For the tech industry and research community, Osprey’s arrival is both a proof of progress and a call to action: larger quantum processors are becoming reality, and now is the time to develop the software, algorithms, and error-mitigation techniques to harness their growing computational power.

Marin Ivezic

I am the Founder of Applied Quantum (AppliedQuantum.com), a research-driven professional services firm dedicated to helping organizations unlock the transformative power of quantum technologies. Alongside leading its specialized service, Secure Quantum (SecureQuantum.com)—focused on quantum resilience and post-quantum cryptography—I also invest in cutting-edge quantum ventures through Quantum.Partners. Currently, I’m completing a PhD in Quantum Computing and authoring an upcoming book “Practical Quantum Resistance” (QuantumResistance.com) while regularly sharing news and insights on quantum computing and quantum security at PostQuantum.com. I’m primarily a cybersecurity and tech risk expert with more than three decades of experience, particularly in critical infrastructure cyber protection. That focus drew me into quantum computing in the early 2000s, and I’ve been captivated by its opportunities and risks ever since. So my experience in quantum tech stretches back decades, having previously founded Boston Photonics and PQ Defense where I engaged in quantum-related R&D well before the field’s mainstream emergence. Today, with quantum computing finally on the horizon, I’ve returned to a 100% focus on quantum technology and its associated risks—drawing on my quantum and AI background, decades of cybersecurity expertise, and experience overseeing major technology transformations—all to help organizations and nations safeguard themselves against quantum threats and capitalize on quantum-driven opportunities.