Quantum Computing Companies

PsiQuantum

(This profile is one entry in my 2025 series on quantum hardware roadmaps and CRQC risk. For the cross‑vendor overview, filters, and links to all companies, see Quantum Hardware Companies and Roadmaps Comparison 2025.)

Introduction

PsiQuantum is a Silicon Valley-based startup taking a fundamentally different approach: photonic quantum computing. Their goal from the outset has been to build a large-scale, fault-tolerant quantum computer using photons, and they have famously stated that they need on the order of one million physical qubits (photons) to get there, a scale they intend to reach by the late 2020s. While the company is extremely secretive, some information has emerged through its partnerships and the DARPA program that gives insight into its roadmap.

Milestones & Roadmap

PsiQuantum has not publicly released a detailed timeline like IBM or Google. Instead, they’ve shared a vision: using their unique photonic architecture, they plan to leap directly to a machine capable of error correction, rather than building a series of small noisy intermediate-scale (NISQ) computers. In 2021, they announced a partnership with GlobalFoundries to manufacture their silicon-photonic chips, indicating they were moving from theory and design to fabrication. By 2022, they had built some key components: single-photon sources, photonic switches, and waveguide-integrated detectors on chips. The company’s CTO, Pete Shadbolt, mentioned they had “entangled 6 photons” on a chip as an initial demonstration, and that they were steadily scaling that up. But exactly when they’ll have a full logical qubit, or a small universal processor, hasn’t been stated publicly.

A clue to their roadmap came from DARPA’s US2QC (Underexplored Systems for Utility-Scale Quantum Computing) program. In February 2025, DARPA selected PsiQuantum (along with Microsoft) to proceed to the final phase of this program, meaning DARPA’s experts vetted PsiQuantum’s plan for a “utility-scale” quantum computer and found it credible. The goal of US2QC and the follow-on Quantum Benchmarking Initiative (QBI) is to determine whether an industrially useful quantum computer can be built by 2033. PsiQuantum’s advancement implies that by 2033 (if not earlier) they aim to have a working large-scale machine. DARPA’s note describes PsiQuantum’s approach as “silicon-based photonics… a lattice-like fabric of photonic qubits” with error correction. Phase 3 of the program will involve PsiQuantum building prototypes and a detailed system design that DARPA will evaluate. Unofficially, PsiQuantum has suggested it wants to have a commercial quantum computer (meaning fully error-corrected and solving big problems) by ~2027-2028. This ambitious timeline may have slid a bit (2028-2029 might be more realistic), but they are still targeting this decade for a full-scale machine.

One can piece together likely milestones: (1) demonstration of a small fault-tolerant logical qubit (perhaps within a cluster state), which they might achieve by 2025-26; (2) a prototype photonic quantum module that can perform a set of logic gates on, say, 50-100 photonic qubits to show scaling, by 2026-27; (3) scaling up the number of modules and linking them via fiber, reaching thousands of physical qubits by 2027+; (4) completing a million-qubit system with enough redundancy and error correction to do something like chemistry simulation or code-breaking by 2028-2030. These are speculative, but they align with the fact that PsiQuantum continues to hire engineers to scale up assembly and has secured over $650M in funding to date, much of which is being spent on building out a high-volume photonic chip fabrication line.

Focus on Fault Tolerance

PsiQuantum’s entire approach is predicated on fault tolerance from the start. They are using a specific scheme called measurement-based quantum computing (MBQC) with photonic cluster states. In this approach, computation and error correction are woven together: they plan to generate a gigantic entangled cluster of photons and then perform measurements that effectively implement logical gates. The cluster state can be made fault-tolerant if it’s built with the right graph structure (a 3D lattice cluster state can correspond to a surface code, for example).
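
To make the measurement-based idea concrete, here is a minimal numpy sketch of the core MBQC primitive: a single-qubit gate enacted purely by measuring one half of a tiny two-photon cluster. This is my own toy illustration of the textbook construction, not PsiQuantum code or architecture.

```python
# Toy MBQC demo: a CZ-linked pair (a 2-qubit cluster) plus one measurement
# implements H * Rz(-theta) on the surviving qubit, up to a heralded Pauli X.
# Pedagogical sketch only -- not PsiQuantum's design.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
CZ = np.diag([1.0, 1.0, 1.0, -1.0])

def Rz(theta):
    return np.diag([1.0, np.exp(1j * theta)])

rng = np.random.default_rng(7)
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)                      # random input state |psi>

plus = np.array([1.0, 1.0]) / np.sqrt(2)
state = CZ @ np.kron(psi, plus)                 # entangle |psi> with |+>

theta = 0.7                                     # measurement angle = gate angle
for m in (0, 1):                                # the two heralded outcomes
    ket = np.array([1.0, (-1) ** m * np.exp(1j * theta)]) / np.sqrt(2)
    out = ket.conj() @ state.reshape(2, 2)      # project qubit 1, keep qubit 2
    out /= np.linalg.norm(out)
    expected = np.linalg.matrix_power(X, m) @ H @ Rz(-theta) @ psi
    print(f"m={m}: overlap = {abs(np.vdot(expected, out)):.6f}")  # 1.000000
```

Both measurement outcomes yield the intended gate, differing only by a known Pauli correction that classical feed-forward tracks, which is exactly why MBQC rewards fast detectors and fast switching.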

In 2021, PsiQuantum researchers co-authored a paper on the theory of fusion-based quantum computing – essentially, a blueprint for assembling small photonic entangled states into a large fault-tolerant cluster using fusion gates (which are basically beam-splitter measurements that “fuse” photon clusters). This indicates that rather than build a device that runs, say, Shor’s algorithm gate by gate, they’re building a machine that continuously performs fusions to grow and maintain a massive encoded state. Fault tolerance will be achieved when the rate of errors in the photonic components is low enough, and the redundancy high enough, that the cluster can correct faster than errors accumulate.
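
For readers who want the one-line version of what a fusion does, here it is as I understand it from the fusion-based quantum computation literature (a sketch, not an official PsiQuantum specification): a type-II fusion is a destructive joint measurement on two photons, one taken from each small resource state, that attempts to read out the two parity operators

\[
X_1 X_2 \quad\text{and}\quad Z_1 Z_2 .
\]

With unboosted linear optics it succeeds with probability 1/2; on the heralded failure outcome the photons are instead measured individually (effectively Z₁ and Z₂), so the Z₁Z₂ parity is still recovered and the surrounding code can treat the event as a located erasure rather than an unknown error.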

PsiQuantum often cites needing on the order of 10⁸ to 10⁹ physical gate operations (like fusions) per logical operation, hence the massive number of photons and devices. They chose photonics because it’s easier to replicate at huge scale: you can fabricate thousands of identical photon source/detector chips with standard manufacturing. Indeed, PsiQuantum’s partnership with GlobalFoundries aims to produce photonic chips with many thousands of components each; these chips will be tiled to create the full system. They’ve said the machine will likely occupy an entire data center floor – essentially a photonic supercomputer.
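
As a sanity check on what those numbers imply, here is a quick back-of-envelope sketch; every figure in it is my own illustrative assumption, not a PsiQuantum specification.

```python
# Back-of-envelope: if each logical operation consumes ~1e8 fusions, what
# aggregate fusion rate does a useful logical clock require?
# Illustrative assumptions only -- not PsiQuantum specifications.
fusions_per_logical_op = 1e8          # lower end of the range cited above
target_logical_rate_hz = 1e4          # assumed: 10,000 logical ops/second
aggregate_fusion_rate = fusions_per_logical_op * target_logical_rate_hz
print(f"aggregate fusion rate needed: {aggregate_fusion_rate:.0e}/s")
# -> 1e12/s: e.g., ~1,000 modules each performing fusions at ~1 GHz,
#    which is why GHz-class photonics and massive chip tiling both matter.
```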

CRQC Implications

If PsiQuantum succeeds in building a million-photon quantum computer with full error correction, cryptographically relevant quantum computing is essentially achieved. Their design goal is exactly the scale one would need to run Shor’s algorithm on large numbers or Grover’s algorithm on large search spaces. For context, recent estimates suggest a million high-quality qubits could factor RSA-2048 in perhaps a matter of days or less, and with further optimization maybe hours. PsiQuantum’s system, being photonic, might even run at higher clock speeds than matter-based qubits (potentially GHz-rate gating if they can detect and feed-forward quickly, though that’s speculative).
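
To put rough numbers on “days or less”: here is my own arithmetic, using the widely cited Gidney-Ekerå estimate that factoring RSA-2048 requires on the order of a few billion Toffoli gates; the logical gate rates below are assumptions, not PsiQuantum figures.

```python
# Wall-clock sketch for RSA-2048: time = Toffoli count / logical gate rate.
# Toffoli count roughly per Gidney-Ekera (~3e9); rates are assumptions.
toffoli_count = 3e9

for rate_hz in (1e3, 1e4, 1e6):               # logical Toffolis per second
    days = toffoli_count / rate_hz / 86_400
    print(f"{rate_hz:.0e} logical ops/s -> {days:,.2f} days")
# 1e3/s -> ~35 days; 1e4/s -> ~3.5 days; 1e6/s -> ~0.035 days (~50 minutes)
```

The spread shows why the logical clock rate, not just the qubit count, decides whether a CRQC breaks a key in an afternoon or a month.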

So in theory, PsiQuantum’s envisioned machine could break RSA and other cryptography almost as soon as it’s operational. The company’s messaging, however, is not about code-breaking – they speak more about revolutionizing chemistry, materials, etc. But one can assume that national security stakeholders are very interested: having a machine that could stealthily decrypt communications or validate new cryptographic systems would be a strategic asset.

It’s telling that China and others are also exploring photonic approaches (China’s Jiuzhang experiments are non-universal, but indicate an interest in photonic quantum computing for speed). PsiQuantum’s aggressive timeline (mid/late-2020s) is perhaps the closest threat to RSA-2048 if it holds true, because other approaches might take until the 2030s to reach similar scale. However, given the enormous technical risk in PsiQuantum’s plan, one should have healthy skepticism – they might hit delays or need more incremental steps. Nonetheless, from a CRQC perspective, PsiQuantum is one to watch as a possible leapfrogger.

Modality & Strengths/Trade-offs

PsiQuantum uses single photons as qubits, traveling through optical circuits on silicon chips.

Strengths: Photons do not decohere simply with time – a photon will maintain its quantum state (e.g., polarization or path mode) as long as it’s not absorbed or scattered. This means there’s no need for extreme cooling of the qubits themselves (though single-photon detectors often need cryogenics). Photons can be manipulated at light speed, and many photons can propagate in parallel through many channels, giving potential for massive parallelism. The fabrication leverages semiconductor-industry processes, meaning in principle it’s easier to scale component count than it is to fabricate, say, millions of Josephson junctions (photonic circuits are physically larger but easier to make in bulk with standard lithography).

Also, photonic qubits can be transmitted over distance with fiber, enabling a natural path to modular scaling: one can have many racks of photonic chips networked by fiber optics, essentially building a distributed quantum computer (something much harder to do with, e.g., superconducting qubits that can’t leave the fridge).

Another strength is the absence of idle error: in superconducting or trapped-ion qubits, even holding a qubit without operations can lead to errors (decoherence). Photons, by contrast, either move or are measured – you don’t store a photon for long (though you can delay one in loops of fiber); thus some error mechanisms are circumvented. PsiQuantum’s design also uses fusion-based gates, which have the property of being probabilistic but heralded: if a fusion fails, you know it (you detect the loss) and you can try again. This heralded approach means you only keep the results of successful operations, potentially reducing logical error rates (with time as the trade-off).
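
A tiny simulation makes the “time as the trade-off” point concrete: with heralded gates you simply retry until success, so the attempt count is geometrically distributed. The success probability below is an assumed, illustrative value.

```python
# Repeat-until-success: heralded attempts are geometric with mean 1/p.
import numpy as np

p = 0.5                                       # assumed per-attempt success
rng = np.random.default_rng(0)
attempts = rng.geometric(p, size=100_000)     # attempts until first success
print(f"mean attempts: {attempts.mean():.2f} (theory: {1/p:.1f})")
print(f"99% of gates succeed within {int(np.percentile(attempts, 99))} attempts")
# mean ~2; 99th percentile ~7 -- the cost of heralding is paid in repetition.
```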

Trade-offs: The biggest is that photonic gates are inherently probabilistic when using linear optics. Two photons do not naturally interact; you need special processes (such as nonlinear optics, or interfering the photons and conditioning on certain measurement outcomes). PsiQuantum’s approach uses interferometers and single-photon detectors to implement entangling gates – these only succeed with a certain probability (often 50% or less). To compensate, they incorporate a lot of redundancy (many attempts in parallel) and use very fast feed-forward to reconfigure circuits on the fly when a gate fails. This complexity is mind-boggling – essentially building a machine that can handle vast numbers of gate failures every second and still reliably compute.
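
The standard answer to probabilistic gates is spatial multiplexing: fire many attempts in parallel and keep one success. A quick sketch of the required parallelism, with illustrative numbers of my own choosing:

```python
# Spatial multiplexing: P(at least one success in N tries) = 1 - (1-p)^N.
import math

p = 0.5                  # assumed single-attempt success probability
target = 0.999           # desired effective per-gate success
N = math.ceil(math.log(1 - target) / math.log(1 - p))
print(f"parallel attempts needed: {N}")       # -> 10 for these numbers
```

Ten parallel attempts per gate may sound modest, but multiplied across billions of fusions it is a large share of the hardware bill, and it is precisely what the fast feed-forward switching network exists to arbitrate.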

Another major challenge is loss and error rates: every photon source has some probability of not producing a photon when needed; every waveguide has some loss per cm; every beam splitter has manufacturing imperfections; every detector has dark counts and inefficiency. When you multiply these probabilities across millions of events, the overall fidelity can crater. PsiQuantum is basically pushing the boundaries of photonic engineering to minimize loss and error at every juncture. For example, they developed high-efficiency superconducting nanowire single-photon detectors (SNSPDs) that can be integrated on chip, and they are pursuing ultra-bright single-photon sources (deterministic emitters based on quantum dots or defect centers are one route the field is exploring). But making those and integrating thousands of them with uniform performance is extremely challenging.
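
The way these losses compound is easy to see in a loss-budget sketch. Every number below is an assumption I chose purely to illustrate the multiplication; none is a measured PsiQuantum figure.

```python
# Loss budget: per-photon survival is the product of component efficiencies.
source_eff = 0.98                 # assumed source efficiency
waveguide_db_per_cm = 0.1         # assumed on-chip propagation loss
path_cm = 10                      # assumed total on-chip path length
coupling_eff = 0.95               # assumed per chip/fiber interface (x2)
detector_eff = 0.97               # assumed SNSPD detection efficiency

waveguide_t = 10 ** (-waveguide_db_per_cm * path_cm / 10)
survival = source_eff * waveguide_t * coupling_eff**2 * detector_eff
print(f"one photon surviving end to end: {survival:.3f}")      # ~0.68
print(f"six photons all surviving:       {survival**6:.3f}")   # ~0.10
```

Even with individually respectable components, a six-photon event succeeds only ~10% of the time under these assumptions, which is why every fraction of a dB matters at this scale.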

Another trade-off: resource overhead – their approach to error correction (like a 3D cluster state) might need, say, 50 physical photons for one logical qubit, repeated in time steps, so maybe 1000 photons per logical gate operation, etc. The sheer overhead is why they need a million photons to do what maybe 1000 perfect qubits could do. It’s a valid approach, but it means a lot of hardware for a given logical capability.
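
Taking these illustrative figures at face value gives a simple capacity estimate (my arithmetic, using the rough numbers above):

\[
N_{\text{logical}} \;\approx\; \frac{N_{\text{physical}}}{\text{overhead per logical qubit}} \;\approx\; \frac{10^{6}}{10^{3}} \;=\; 10^{3},
\]

i.e., a million physical photons buys on the order of a thousand logical qubits, consistent with the comparison to “maybe 1000 perfect qubits” above.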

Finally, timing synchronization is a trade-off: photons must arrive and interfere within femtoseconds of one another. This requires exquisite synchronization across a big machine – a tough but not impossible engineering feat (the telecom industry handles picosecond timing on networks, for instance).
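
For a sense of scale (my arithmetic, assuming a 100 fs timing budget and a medium with refractive index n ≈ 1.5, e.g., glass or fiber): the corresponding optical path-length tolerance is

\[
\Delta x \;=\; \frac{c}{n}\,\Delta t \;\approx\; \frac{3\times 10^{8}\ \text{m/s}}{1.5} \times 100\ \text{fs} \;\approx\; 20\ \mu\text{m},
\]

so every competing path through the machine must be length-matched to a few tens of microns, or actively stabilized to that level.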

Track Record

PsiQuantum is tight-lipped, but a few things are known. They have produced a “Q1” chip, a 256-mode photonic circuit used to generate small cluster states, which is reportedly functional. They’ve demonstrated some basic two-qubit algorithms with photons (like small quantum Fourier transform circuits) in lab experiments. They haven’t published many results in journals, likely to keep a competitive edge. Their strategy has been to partner (e.g., with GlobalFoundries in 2021, with Northrop Grumman for some packaging tech, and reportedly with EPB, a utility company, to eventually host a quantum datacenter).

Financially, they’ve raised around $650M and are rumored to be raising more, implying they are hitting internal milestones to justify continued investor confidence for a very expensive project. The DARPA selection in 2025 is a big external validation – essentially the US government saying this approach looks like it could work. That lends credibility.

However, since no public prototype exists yet, PsiQuantum is often met with healthy skepticism by others in the field. Some experts question whether their timelines are realistic, or whether unforeseen physics (like background photon scattering or crosstalk in large optical switches) might derail things. It’s true that photonic quantum computing had a reputation for requiring too many resources, but PsiQuantum’s theoretical advances (like magic state cultivation to reduce distillation overhead) have cut down the requirements significantly. They’ve claimed a 20x reduction in the qubit count needed compared to older estimates by using these new techniques. Their track record so far is more about R&D milestones than delivered devices: e.g., creating a single-photon source with a certain purity, making a low-loss waveguide, etc. They’ve filed many patents in these areas, so there’s progress under the hood.

The real test of their track record will be whether, by say 2025-26, they can demonstrate a convincing small-scale integrated photonic quantum processor. If they do, their approach will gain a lot of momentum.

Challenges

Virtually everything about PsiQuantum’s plan is challenging. They need near-perfect alignment and stability in optical components – even tiny temperature fluctuations can drift an interferometer out of phase. They’ll need sophisticated feedback systems to keep the whole optical network stable. The manufacturing yield of photonic chips with thousands of components is also a worry: if even 1% of components are duds, that could kill the device. They might have to build in redundancy or ways to route around defective parts, adding design complexity (see the yield sketch below). Detector availability is a challenge too: they might need tens of thousands of single-photon detectors; currently, companies can produce maybe hundreds of SNSPDs for astronomy or research, but scaling to tens of thousands is being worked on (some integrated approaches put multiple detectors on one chip, which might help).
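
The yield worry is easy to quantify with a toy model; the numbers are assumed, and the point is the exponent, not the specific values.

```python
# Chip yield when all n components must work: yield = y**n.
n = 1_000                                  # assumed components per chip
for y in (0.99, 0.999, 0.9999):            # assumed per-component yields
    print(f"per-component yield {y}: chip yield = {y**n:.2e}")
# 0.99 -> ~4.3e-05, 0.999 -> ~0.37, 0.9999 -> ~0.90
# Hence the need for redundancy and for routing around defective components.
```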

Another challenge is talent and time – photonic quantum computing is less explored than superconducting circuits or trapped ions, so PsiQuantum has had to pioneer many things from scratch. The timeline might stretch if, for instance, a certain loss target isn’t met and they have to redesign something fundamental. And like all ambitious projects, funding risk is there: if they need another billion dollars and economic conditions are bad, can they get it? That said, given the potential payoff (virtually unlimited computational power for some problems), there will likely be appetite to fund it as long as progress is demonstrated.

In short, PsiQuantum’s roadmap is perhaps the most high-risk/high-reward of all: they could be the first to the cryptography-breaking finish line, or they could stall out if any part of the delicate photonic house of cards collapses. The coming few years will be crucial in showing whether their grand photonic gamble pays off.

Marin Ivezic

I am the Founder of Applied Quantum (AppliedQuantum.com), a research-driven consulting firm empowering organizations to seize quantum opportunities and proactively defend against quantum threats. A former quantum entrepreneur, I’ve previously served as a Fortune Global 500 CISO, CTO, Big 4 partner, and leader at Accenture and IBM. Throughout my career, I’ve specialized in managing emerging tech risks, building and leading innovation labs focused on quantum security, AI security, and cyber-kinetic risks for global corporations, governments, and defense agencies. I regularly share insights on quantum technologies and emerging-tech cybersecurity at PostQuantum.com.