Atom Computing
(This profile is one entry in my 2025 series on quantum hardware roadmaps and CRQC risk. For the cross‑vendor overview, filters, and links to all companies, see Quantum Hardware Companies and Roadmaps Comparison 2025.)
Introduction
Atom Computing is a fast-rising startup developing gate-based quantum computers using optically trapped neutral atoms as qubits. The company made headlines in late 2023 by announcing a 1,225-site optical atom array populated with 1,180 (physical) qubits – the first universal quantum platform to surpass 1,000 qubits. This dramatic leap (from a ~100-qubit first-generation system to >1,000 qubits in one generation) showcases the inherent scalability of Atom’s neutral-atom approach. Atom’s qubits are encoded in the nuclear spin states of neutral atoms, which yields exceptionally long coherence times (on the order of ~40 seconds). Combined with all-to-all connectivity enabled by mobile laser-trapped atoms, this platform has been designed from the outset with fault-tolerant quantum computing in mind. The startup’s aggressive progress and focus on error correction have quickly made it a serious contender in the race toward practical, large-scale quantum computers.
Milestones & Roadmap
2018: Atom Computing is founded (Berkeley, CA) by Dr. Ben Bloom with the vision of leveraging neutral atoms for scalable quantum computing. (Within five years, the company would “go up against larger companies” and hold its own, thanks to laser-focused technology development.)
July 2021: The team unveils “Phoenix,” its first-generation quantum system capable of trapping ~100 neutral atoms as qubits. Phoenix demonstrated “astonishing stability,” including a world-record coherence time of ~40 seconds for a qubit’s quantum state. Atom built Phoenix in under two years, faster than any competitor had reached a comparable qubit count at the time.
Early 2023: Atom Computing is selected by DARPA for the US2QC program (Underexplored Systems for Utility-Scale Quantum Computing) to accelerate development of utility-scale quantum hardware. Under this project, Atom focuses on scaling up atom arrays, enhancing qubit connectivity, and implementing advanced quantum error correction towards fault-tolerant architectures. This selection reflected confidence that Atom’s roadmap could achieve utility-scale quantum operation “much sooner than conventional predictions”.
Oct 2023: The company announces a 1,180-qubit prototype (1,225 possible sites) in its second-generation platform – marking the first 1,000+ qubit gate-based quantum computer in the industry. “This order-of-magnitude leap – from 100 to 1,000-plus qubits within a generation – shows our atomic array systems are quickly gaining ground”, noted CEO Rob Hays. Achieving the 1K-qubit milestone is seen as a major step toward fault tolerance, since large qubit counts will be required for error-corrected logical qubits. Atom also demonstrated key capabilities on this platform, including mid-circuit measurement (measuring a qubit’s state during computation without disturbing others) and other error-mitigation techniques.
Late 2024: In partnership with Microsoft, Atom Computing shows a breakthrough in quantum error correction on neutral atoms. Using Microsoft’s Azure Quantum error-correction stack on Atom’s hardware, the team created and entangled 24 logical qubits – the largest number of logical qubits entangled to date. They also performed real-time detection and correction of qubit loss errors during operation, an important step toward fault tolerance. Moreover, they successfully ran a benchmark algorithm (Bernstein-Vazirani) on 28 logical qubits, with the error-corrected logical qubits yielding more accurate results than uncorrected physical qubits. This demonstrated the first instance of reliable computation on a commercial neutral-atom quantum system with live error correction.
2025 (planned): Atom and Microsoft have announced plans to deliver an integrated quantum machine to select customers by 2025, combining Atom’s >1,000-qubit neutral-atom hardware with Azure’s cloud and qubit virtualization software. This system – offered both via Azure cloud and as on-premises hardware – will provide “reliable logical qubits on a commercial quantum machine,” aiming to empower early real-world applications in chemistry, materials science, and more. Notably, it will be one of the first commercial quantum computers delivered with error-corrected logical qubits, initially targeting on the order of 50 logical qubits.
Beyond 2025: While Atom Computing hasn’t published a detailed generation-by-generation roadmap, it has outlined its scaling plans in public statements and a 2025 whitepaper. In that whitepaper, Atom states that each new generation of its neutral-atom quantum platform is expected to achieve roughly an order-of-magnitude (10×) increase in qubit count. Following this trajectory, Atom’s next generation is projected to expand from ~1,000 qubits into the >10,000 physical qubit range, which the company anticipates will yield over 100 error-corrected logical qubits on that system. This aggressive scaling roadmap aligns with Atom’s participation in DARPA’s programs aimed at “utility-scale” quantum computers by 2033. Indeed, founder Ben Bloom has remarked that Atom’s technology and roadmap are on track with DARPA’s timeline for achieving a fault-tolerant, million-qubit quantum computer by the early 2030s. In practical terms, Atom Computing is planning to continue 10× growth per generation – enabling a handful of logical qubits today, on the order of tens to ~100 logical qubits in the next few years, and eventually scaling to thousands of logical qubits in the 2030 timeframe, which would be sufficient for impactful, fault-tolerant quantum computing.
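To make the arithmetic of this trajectory concrete, here is a minimal back-of-the-envelope sketch. The two-year generation cadence and the ~100:1 physical-to-logical ratio are my own illustrative assumptions (the ratio is loosely implied by the whitepaper’s “>10,000 physical → >100 logical” projection); none of these are official Atom Computing figures.

```python
# Illustrative projection of Atom's stated 10x-per-generation scaling.
# Assumptions (mine, not Atom's): a new generation roughly every 2 years
# starting from the ~1,180-qubit 2023 system, and a ~100:1 physical-to-
# logical overhead loosely implied by ">10,000 physical -> >100 logical".

PHYSICAL_PER_LOGICAL = 100  # assumed overhead; real QEC codes vary widely

def project(start_year=2023, start_qubits=1_180, years_per_gen=2, generations=5):
    qubits = start_qubits
    for gen in range(generations + 1):
        year = start_year + gen * years_per_gen
        logical = qubits // PHYSICAL_PER_LOGICAL
        print(f"{year}: ~{qubits:>11,} physical  ->  ~{logical:>9,} logical")
        qubits *= 10

project()
# 2023: ~      1,180 physical  ->  ~       11 logical
# 2025: ~     11,800 physical  ->  ~      118 logical
# 2027: ~    118,000 physical  ->  ~    1,180 logical
# 2029: ~  1,180,000 physical  ->  ~   11,800 logical
# ...the million-physical-qubit scale lands around the turn of the decade
# under these assumptions, broadly in line with DARPA's early-2030s horizon.
```

Under these (generous) assumptions, thousands of logical qubits appear in the early 2030s, which is exactly the window the roadmap language points to.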
Focus on Fault Tolerance
From the beginning, Atom Computing has prioritized techniques for quantum error correction and fault tolerance on its neutral-atom platform. A signature capability is mid-circuit measurement, which Atom demonstrated on neutral atoms in 2023. This means the system can measure the quantum state of individual qubits during the middle of a computation to check for errors, without collapsing the state of other qubits. Mid-circuit readout allows the detection of errors (or loss of an atom) on ancilla qubits in real time, enabling the quantum program to correct those errors on the fly and proceed – a crucial requirement for implementing iterative error-correction codes. Atom’s hardware uniquely allows qubit measurement, state reset, and reuse within a single algorithm cycle, which is “nearly unmatched” in the industry. Using this capability, the company has built a software stack with real-time conditional logic, so that if an error is detected via an ancilla qubit measurement, the system can branch and apply corrective operations immediately.
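To illustrate the detect-branch-correct pattern described above, here is a minimal sketch using Qiskit’s generic control-flow API, chosen purely for illustration – this is not Atom Computing’s software stack. It extracts a parity syndrome onto an ancilla mid-circuit, branches on the outcome in real time, and resets the ancilla for reuse.

```python
from qiskit import QuantumCircuit

qc = QuantumCircuit(3, 2)      # qubits 0 and 2 = data, qubit 1 = ancilla

# Prepare a simple entangled state |00> + |11> on the two data qubits.
qc.h(0)
qc.cx(0, 2)

# Mid-circuit syndrome extraction: map the Z0*Z2 parity onto the ancilla.
qc.cx(0, 1)
qc.cx(2, 1)
qc.measure(1, 0)               # measure the ancilla only; data qubits survive

# Real-time conditional logic: branch on the syndrome and correct in-flight.
with qc.if_test((qc.clbits[0], 1)):
    qc.x(0)                    # apply a corrective flip if the parity was odd

qc.reset(1)                    # reset the ancilla for reuse within the circuit,
                               # as the paragraph describes for Atom's hardware
qc.measure(0, 1)               # the computation continues after the correction
```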
These technical advances bore fruit in late 2024 when Atom Computing, in collaboration with Microsoft, achieved a record 24 entangled logical qubits on a neutral-atom QPU. Each logical qubit was encoded across multiple physical qubits using an error-correcting code, so that the ensemble could detect and compensate for errors like decoherence or even atom loss. In the experiment, the team was able to actively identify when a neutral atom qubit dropped out of its trap and then correct for that loss during the computation. By encoding information redundantly, the logical qubits exhibited error rates far below those of individual physical qubits, confirming that entanglement and error correction were working as intended. In fact, after detecting errors and correcting losses, the logical qubits’ error rate improved by a factor of four compared to the baseline physical qubits. The culmination of these efforts was running a small algorithm (the Bernstein-Vazirani problem) on 28 logical qubits with successful results: the error-corrected logical qubits produced a more accurate answer than an attempt using the same number of raw physical qubits. This demonstrated that Atom’s nascent logical-qubit system can outperform an equivalently sized “noisy” quantum system by leveraging fault-tolerant techniques – a pivotal proof-of-concept on the road to scalable quantum computing. Moving forward, Atom is focusing on increasing the distance of its error-correcting codes (for even lower logical error rates) as larger qubit counts come online, and incorporating features like qubit reinitialization and continuous atom replacement to extend algorithm runtimes. Overall, Atom Computing’s early and sustained emphasis on mid-circuit control, error detection, and QEC puts it at the forefront of the quest for truly fault-tolerant quantum processors.
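For context on why increasing code distance matters, the textbook heuristic for a distance-d code (a generic approximation, not an Atom-specific formula) says the logical error rate falls exponentially in d once physical errors are below threshold:

```latex
p_L \;\approx\; A \left(\frac{p}{p_{\mathrm{th}}}\right)^{(d+1)/2}
```

Here p is the physical error rate, p_th is the code’s threshold, d is the code distance, and A is an order-one constant. Every increase of d by 2 suppresses the logical error rate by roughly another factor of p_th/p, which is why larger physical qubit counts (enabling larger d) translate directly into lower logical error rates.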
CRQC Implications
The rapid progress of Atom Computing’s neutral-atom architecture carries significant implications for the timeline of achieving a CRQC (Cryptographically Relevant Quantum Computer) – i.e. a quantum machine capable of breaking modern cryptographic codes. By most estimates, breaking strong encryption (like RSA-2048) will require on the order of thousands of error-corrected logical qubits, corresponding to millions of physical qubits if using traditional QEC codes. While no current system is near that scale, Atom’s roadmap of exponential qubit growth suggests that a CRQC might be reached sooner than many expected. The company’s 2023 prototype already demonstrated ~1k physical qubits, and plans are in motion to reach >10k qubits in the next generation, with further orders-of-magnitude scaling foreseen. If Atom can continue this trajectory – along with improving fidelities and error correction – then crossing the threshold into the hundreds of thousands or millions of qubits needed for cryptographically relevant algorithms could plausibly happen within the next decade or so. In fact, industry experts are beginning to warn that a capable code-breaking quantum computer could emerge by the early-to-mid 2030s. Atom Computing’s aggressive scaling and early fault-tolerance achievements reinforce these projections – indicating that the leap to a CRQC might arrive on the optimistic side of current timelines if such platforms continue to mature.
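As a rough yardstick, the widely cited Gidney–Ekerå (2019) estimate puts factoring RSA-2048 at roughly 20 million noisy physical qubits running for about eight hours. A quick sketch (the generation cadence is my assumption) shows why 10×-per-generation scaling makes early-to-mid-2030s CRQC timelines arithmetically plausible:

```python
import math

# Rough CRQC resource arithmetic, using the Gidney-Ekera (2019) estimate
# for factoring RSA-2048 (~20 million noisy physical qubits, ~8 hours) as
# a yardstick. Figures are illustrative; real requirements depend heavily
# on code choice, physical error rates, and gate speeds.

GIDNEY_EKERA_PHYSICAL = 20_000_000   # physical qubits in the 2019 estimate
CURRENT_ATOM_QUBITS = 1_180          # Atom's 2023 prototype

generations_needed = math.ceil(
    math.log10(GIDNEY_EKERA_PHYSICAL / CURRENT_ATOM_QUBITS)
)
print(f"10x generations from ~1.2k to ~20M qubits: {generations_needed}")
# -> 5 generations; at a ~2-year cadence (my assumption), five generations
#    from 2023 lands in the early-to-mid 2030s -- the window experts cite.
```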
For cybersecurity, the accelerating progress at companies like Atom is a double-edged sword. On one hand, it heralds powerful quantum computers that could solve complex problems; on the other, it underscores that public-key encryption’s days are numbered, and the world must prepare defenses against quantum attacks. Atom’s partnership with Microsoft to deliver a 1,000+ qubit system with logical qubit capability by 2025 suggests that small error-corrected quantum machines will soon be in real users’ hands. While these near-term devices won’t break RSA, they are critical stepping stones toward the “Q-day” when a cryptographically relevant quantum computer comes online. Each generation that Atom Computing delivers – say 10k qubits, then 100k, then 1 million – will shrink the gap to that milestone. Notably, Atom’s focus on fault tolerance is directly applicable to cryptographic workloads: only a fault-tolerant quantum computer could execute the deep, complex algorithms (like Shor’s factoring) needed to crack encryption. By investing in error correction now and proving it at modest scales, Atom is effectively building the foundation required for a future CRQC. If their neutral-atom approach continues to scale with the anticipated order-of-magnitude jumps, it could significantly accelerate the countdown to a cryptography-breaking quantum machine. This possibility adds urgency for governments and enterprises to transition to post-quantum cryptography well before such hardware becomes operational. In summary, Atom Computing’s roadmap – if realized – suggests that the first cryptography-threatening quantum computers could arrive on an aggressive timeline, highlighting the company’s pivotal role in both advancing quantum technology and ringing the alarm for quantum cybersecurity.
Modality & Strengths/Trade-offs
Atom Computing’s technology is based on neutral atom qubits – specifically, arrays of individual atoms (like strontium or ytterbium) trapped in a vacuum by intersecting laser beams (optical tweezers). This physical modality offers several distinct advantages. First, neutral atoms have no electric charge, so they can be held just a few micrometers apart without strongly perturbing each other. This allows Atom to pack large 2D arrays of qubits in a relatively small space (thousands of atoms within a millimeter-scale grid) and even rearrange atom positions to achieve flexible, all-to-all connectivity for two-qubit gates.
Second, using atomic nuclear spin states to encode qubits gives extraordinary isolation from environmental noise – yielding very long coherence times (dozens of seconds) compared to the microseconds or milliseconds of superconducting or certain solid-state qubits. Such long qubit memory is a huge boon for executing complex circuits and error correction cycles before decoherence sets in.
Third, all atoms of the same element are virtually identical, which means neutral-atom qubits don’t suffer from manufacturing variability; this intrinsic uniformity avoids many calibration and yield issues seen in solid-state qubit fabrication. Finally, Atom Computing’s platform operates at room temperature: the atoms themselves are laser-cooled inside a vacuum chamber, but the system relies on lasers and standard vacuum technology rather than the sub-Kelvin refrigerators required for superconducting qubits. The relative simplicity of room-temperature operation could translate to practical advantages in scaling up: systems might be easier to maintain, with lower power consumption and fewer specialized infrastructure requirements.
Despite these strengths, neutral atom quantum computing comes with its own set of challenges and trade-offs. One is the precision control needed to manipulate many individual atoms with laser beams. Scaling to hundreds or thousands of atoms means carefully steering and tuning a forest of laser interactions – an “experimental challenge” that grows with system size. Engineering stable optical traps and beam alignment for 1,000+ qubits is non-trivial, though Atom’s recent results show it’s feasible.
Another trade-off is the speed of gate operations. Two-qubit logic gates on neutral atoms (often implemented via exciting atoms to high-energy Rydberg states to induce interactions) tend to be slower than in superconducting circuits – on the order of microseconds per gate, versus tens of nanoseconds for superconducting qubits. While neutral-atom gates are still reasonably fast (and in some cases faster than trapped-ion gates), the slower operation rate means algorithms may take longer to run, putting more demand on coherence time. Fortunately, Atom’s qubits have exceptional coherence to buffer this, and parallel operations across the array can offset some speed limits (many atoms can be driven simultaneously).
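A quick back-of-the-envelope comparison shows why long coherence buffers slow gates. The numbers below are representative figures from the text and public literature (microsecond-scale Rydberg gates with ~40 s coherence, versus tens-of-nanosecond superconducting gates with coherence around 100 µs), not any specific vendor’s specs:

```python
# How many sequential two-qubit gate "layers" fit inside one coherence time?
platforms = {
    #                  gate time (s), coherence time (s)
    "neutral atom":    (1e-6,         40.0),     # ~us Rydberg gates, ~40 s coherence
    "superconducting": (20e-9,        100e-6),   # tens-of-ns gates, ~100 us coherence
}

for name, (t_gate, t_coh) in platforms.items():
    print(f"{name:>15}: ~{t_coh / t_gate:,.0f} sequential gates per coherence time")
#    neutral atom: ~40,000,000 sequential gates per coherence time
# superconducting: ~5,000 sequential gates per coherence time
# Slower gates, but a vastly deeper circuit budget before decoherence sets in.
```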
A related challenge is fidelity: historically, achieving high-fidelity two-qubit gates in Rydberg-atom systems was difficult, but Atom Computing has recently reported 99.6% two-qubit gate fidelity – the highest for any commercial neutral-atom system. This fidelity is on par with some superconducting and ion-trap systems, indicating the gap is closing. Still, maintaining such fidelities across thousands of qubits will be an ongoing battle, requiring improvements in laser stability, atom cooling, and error calibration.
Another inherent issue with neutral atoms is the possibility of atom loss – if an atom is accidentally kicked out of its trap (by background gas collision or photon scattering), it represents a lost qubit. Atom has mitigated this by detecting and correcting loss events, and future systems may employ automated atom reloading to fill any vacancies between algorithm runs.
Finally, while running at room temperature is convenient, the vacuum chamber and high-power lasers needed for neutral-atom control are still complex hardware. As Atom scales up, integrating thousands of optical components and ensuring reliable trapping will require cutting-edge optical engineering. In summary, the strengths of Atom Computing’s neutral-atom modality lie in its scalability, connectivity, and qubit quality (long coherence, identical qubits), whereas the trade-offs involve control complexity, gate speed, and error sources distinct to atomic systems. Thus far, Atom has shown an ability to leverage the strengths (demonstrating record qubit counts and coherence) while steadily overcoming the challenges (boosting fidelities, implementing error correction for losses), giving credibility to neutral atoms as a competitive platform for quantum computing.
Track Record
Although Atom Computing is a relatively young company, its track record to date is marked by rapid execution and technical firsts. Founded in 2018 by a team with roots in atomic physics, the startup quickly attracted attention after it quietly built a 100-qubit prototype (“Phoenix”) by 2021. This accomplishment – 100 optically trapped atomic qubits with world-leading coherence – was achieved faster than any competitor had reached a similar scale at the time. The feat gave Atom an early reputation for speed and ingenuity in hardware development. In the ensuing years, the company has successfully leveraged partnerships and funding to maintain this momentum. It secured a $60 million Series B funding round in 2022, led by venture firms (Third Point, Prime Movers, Venrock, etc.), specifically to fuel the construction of its second-generation, larger-scale system. With a strong capital base and new engineering talent (including CEO Rob Hays, a former Intel executive, joining in 2021), Atom established an R&D facility in Boulder, Colorado, where its 1,000+ qubit system was developed. By late 2023, Atom became the first company ever to announce a gate-based quantum computer exceeding 1,000 qubits, beating out efforts from far larger organizations. Achieving this “quantum milestone” earned recognition from industry analysts: “Founded just five years ago… [Atom] is going up against larger companies with more resources and holding its own,” observed Paul Smith-Goodson of Moor Insights, who called Atom’s singular focus on scaling qubits “highly impressive”.
Crucially, Atom Computing has not pursued scale at the expense of qubit quality – it simultaneously demonstrated record coherence and began tackling error correction, as noted. The company published peer-reviewed research (e.g. a Nature Communications paper in 2022 on 40-second coherence with nuclear-spin qubits) and routinely shares technical progress. Its collaboration with Microsoft, announced in 2023, further validates its credibility: Microsoft chose Atom’s hardware as a testbed for advancing logical qubits, and their joint results (24 logical qubits entangled) speak to Atom’s reliable hardware operation.
Additionally, Atom’s inclusion in DARPA’s high-profile quantum programs (US2QC and the follow-on QBI) underscores external confidence in its roadmap. “We are confident that Atom Computing’s technology and roadmap are on track with DARPA’s timeline for achieving utility-scale quantum computing,” founder Ben Bloom stated upon joining the DARPA initiative. Beyond government partnerships, Atom has been engaging with early adopters in academia and industry (for example, Vodafone’s research arm and Entropica Labs are quoted as partners) to run pilot applications and “reserve time” on its forthcoming machines. This indicates that Atom is already integrating its system into a broader ecosystem of quantum software and use-case development.
In terms of corporate growth, Atom Computing has expanded from its Bay Area origins to a presence in Colorado. In summary, Atom’s track record is characterized by bold engineering milestones (100 qubits, 1,000+ qubits, etc.), strong validation from funding, government, and industry partners, and a consistent focus on its end-goal of scalable, fault-tolerant computing. All these factors bolster the credibility of its ambitious roadmap.
Challenges
Despite its impressive progress, Atom Computing faces a number of challenges on the path to fully realizing its vision of a utility-scale, fault-tolerant quantum computer.
One fundamental challenge is the sheer scaling requirement: as noted, achieving quantum advantage for hard problems (and especially breaking cryptography) likely demands millions of physical qubits and thousands of logical qubits. Scaling any technology by three or four more orders of magnitude is a massive undertaking. Atom’s plan of ~10× qubit growth each generation is aggressive but will inevitably encounter diminishing returns or unforeseen bottlenecks as qubit counts rise. Engineering systems with 10,000 or 100,000 stable optical traps, and precisely controlling each, will push the state of the art in laser optics, vacuum stability, and automated calibration. The company will have to innovate in control electronics and software as well – coordinating and scheduling operations on so many qubits (and performing real-time error correction across them) could strain classical processing unless intelligent orchestration (and perhaps parallelization) is implemented.
Another key challenge is maintaining high fidelity and low error rates at scale. While Atom has shown 99%-plus gate fidelities on a small subset of qubits, it must ensure these performance levels hold across thousands of qubits simultaneously. Noise sources that are negligible at 100 qubits (e.g. laser phase noise, cross-talk between beams, fluctuating magnetic fields) might become significant when controlling 1,000 or more qubits in parallel. The error budget for fault-tolerant quantum computing is unforgiving – even slight increases in error rates can exponentially increase the overhead of error correction. Atom will need to keep improving the stability of its lasers and traps, possibly incorporating advanced techniques like better atomic cooling, dynamic decoupling, or error-transparent gate designs to keep physical error probabilities well below the threshold required by QEC codes. Qubit loss remains a unique challenge for neutral atoms: although mid-circuit correction of losses was demonstrated, prevention is preferable. As Atom scales up, ensuring an ultra-high vacuum (to minimize collisions) and perhaps implementing active atom replacement between algorithm rounds will be important to keep the effective error rate low. Researchers have suggested that continuous atom reloading and improved cooling could help address these limitations as systems grow larger.
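To see how unforgiving that error budget is, here is a sketch using the standard surface-code heuristic from earlier (with conventional illustrative values A = 0.1 and p_th = 1%, and the rough 2d² physical-qubit footprint per logical qubit; none of these are Atom’s numbers):

```python
# Why small fidelity changes matter: the required code distance d (and thus
# the physical-qubit overhead) is very sensitive to the physical error rate p.
def required_distance(p, target=1e-12, p_th=1e-2, A=0.1):
    """Smallest odd distance d with A * (p/p_th)**((d+1)/2) <= target."""
    d = 3
    while A * (p / p_th) ** ((d + 1) / 2) > target:
        d += 2
    return d

for p in (4e-3, 2e-3, 8e-4):        # illustrative physical two-qubit error rates
    d = required_distance(p)
    print(f"p = {p:.0e}: distance {d}, ~{2 * d * d:,} physical qubits per logical")
# p = 4e-03: distance 55, ~6,050 physical qubits per logical
# p = 2e-03: distance 31, ~1,922 physical qubits per logical
# p = 8e-04: distance 21, ~882 physical qubits per logical
# Modest fidelity gains cut the per-logical-qubit overhead several-fold.
```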
There are also software and architecture challenges. Atom’s all-to-all connectivity is a strength, but effectively using it in algorithms requires smart compilers and scheduling – moving atoms or using Rydberg interactions optimally is non-trivial for complex circuits. Work is underway (e.g. new compilers like “Parallax” for neutral atoms) to exploit this flexibility without incurring too much overhead from atom transport. Moreover, designing effective error-correcting codes that fit well with neutral-atom hardware is an ongoing effort. The company will need to implement higher-distance logical qubits and perhaps experiment with different QEC schemes (beyond the entangled-cat-state approach used in the 24-logical-qubit demo) to find the most resource-efficient path to fault tolerance on their platform.
From a business perspective, Atom Computing also faces competitive and validation challenges. The quantum computing field is crowded with giants like IBM, Google, and Intel (pursuing superconducting or spin qubits), as well as other neutral-atom startups (Pasqal, QuEra) who have their own roadmaps. For example, Pasqal (France) has outlined a plan for 10,000 neutral-atom qubits by 2026, indicating Atom will not be alone in pushing this modality. Atom Computing must continue to differentiate itself by execution speed and technical benchmarks to stay ahead. Additionally, as with any breakthrough claims, there is external skepticism to overcome: for instance, upon the 1,180-qubit announcement, some observers noted that merely having thousands of atoms trapped is not the same as performing useful computations with them. Atom will need to demonstrate practical quantum performance metrics (such as high quantum volume or solving a classically hard problem) to convince doubters that its large systems are not just science experiments but genuinely programmable, high-fidelity computers. Delivering the planned >1,000-qubit commercial system to customers in 2025 will be a crucial proving ground – real users will expect robust operation, not just headline-grabbing qubit counts.
Lastly, reaching full fault tolerance and “utility-scale” computing will require navigating unknown theoretical and engineering territory. Atom’s participation in programs like DARPA QBI means it will be subject to rigorous milestones and independent evaluations, which raises the pressure to meet promised performance on schedule. The company will have to iterate rapidly, learn from each generation, and possibly contend with surprises (e.g. new error modes appearing at larger scales). All of these challenges are formidable, but not insurmountable. Atom Computing’s strategy of combining scaling with error correction R&D is a prudent one for tackling them – it ensures that improvements in quantity of qubits go hand-in-hand with quality of computation. If the company can continue to solve the technical hurdles of controlling ever-larger atomic arrays with high fidelity, and if it can demonstrate stepwise progress toward practical algorithms, it stands a good chance of overcoming these challenges and making history with one of the first fully fault-tolerant quantum computers.