Quantum Computing Paradigms: Photonic QC

(For other quantum computing paradigms and architectures, see Taxonomy of Quantum Computing: Paradigms & Architectures)

What It Is

Photonic quantum computing uses particles of light – photons – as qubits. Typically, the qubit is encoded in some degree of freedom of a single photon, such as its polarization (horizontal = |0⟩, vertical = |1⟩), its presence or absence in a given mode (occupation-number basis: no photon = |0⟩, one photon = |1⟩), or its time bin (photon arriving early vs. late). Photons are appealing qubits because they travel at the speed of light, interact very weakly with their environment (so they maintain coherence over long distances, which is why photons are used in quantum communication), and operate at room temperature. Photonic quantum computing generally involves manipulating photons with beam splitters, phase shifters, and, in some schemes, optical nonlinearities to enact quantum gates. However, photons do not naturally interact with each other (two photons can pass through each other without effect), which makes two-qubit gates challenging. The main approach to achieving effective interactions is measurement-induced nonlinearity: employing detectors and additional (ancilla) photons to create entanglement probabilistically, or using special media where photons do interact (such as Rydberg atomic ensembles or Kerr media, though these are less developed). The field really took off after 2001, when the KLM protocol (Knill, Laflamme, Milburn) showed that scalable quantum computing with only linear optics and photon detection is possible in principle, albeit with high resource overhead.
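
To make the polarization encoding concrete, here is a minimal sketch (plain NumPy, not tied to any particular hardware or library) that represents |0⟩ and |1⟩ as horizontal and vertical Jones vectors and shows how a half-wave plate at 22.5° acts as a Hadamard on that qubit; the matrix and angle conventions are the standard Jones-calculus ones, used here purely for illustration.

```python
import numpy as np

# Polarization qubit in the Jones-vector picture: |0> = horizontal, |1> = vertical.
ket0 = np.array([1, 0], dtype=complex)   # |H>
ket1 = np.array([0, 1], dtype=complex)   # |V>

def half_wave_plate(theta):
    """Jones matrix of a half-wave plate with fast axis at angle theta (up to a global phase)."""
    c, s = np.cos(2 * theta), np.sin(2 * theta)
    return np.array([[c, s], [s, -c]], dtype=complex)

# A half-wave plate at 22.5 degrees acts as a Hadamard on the polarization qubit.
H = half_wave_plate(np.pi / 8)
print(np.round(H @ ket0, 3))   # ~[0.707,  0.707] -> (|H> + |V>)/sqrt(2)
print(np.round(H @ ket1, 3))   # ~[0.707, -0.707] -> (|H> - |V>)/sqrt(2)
```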

Key Academic Papers

A scheme for efficient quantum computation with linear optics by Knill, Laflamme, and Milburn (Nature, 2001) is the landmark paper proving that photons, even with only beam splitters and detectors (i.e. linear optics), can perform universal quantum computing given sufficient ancilla photons and single-photon sources. This result was surprising because it had previously been thought that some nonlinear interaction (like an optical Kerr effect) was needed to entangle two photons deterministically. KLM showed that by preparing certain entangled ancilla states and using post-selection (measuring some photons and, based on the result, keeping the instances where the desired gate succeeded), one can implement an effective CNOT gate with success probability below 1, but arbitrarily close to 1 given enough resources. This put photonics on the map as a viable approach.

Another key development was the concept of one-way or measurement-based quantum computing introduced by Raussendorf and Briegel (PRL, 2001): prepare a large entangled cluster of photons (a cluster state) and then perform a sequence of single-photon measurements to execute quantum logic. This fits well with photonics, because generating entangled photonic cluster states (via down-conversion or quantum dot sources) and then measuring photons are operations optics handles well.

Since then, many experimental papers have demonstrated small-scale photonic quantum gates (e.g. a photonic CNOT gate was demonstrated in 2003 by O'Brien et al., using two photons and interferometers, reproducing the CNOT truth table with ~0.81 fidelity). Quantum teleportation with photons was shown as early as 1997 (Bouwmeester et al.), essentially a three-photon entanglement operation. In recent years, quantum photonic chips (integrated waveguide circuits) have been used to generate and manipulate up to several dozen photonic qubits in special states; notably, the Jiuzhang experiment (USTC, China, 2020) demonstrated boson sampling with 76 photons, a task not tractable for classical simulation, claiming a form of "quantum advantage" with photonics. The company PsiQuantum is investing heavily in photonic cluster-state quantum computing, aiming for a million-photon-scale system.

How It Works

There are two main paradigms for photonic QC: discrete-variable (DV) photonics, where qubits are individual photons (with polarization or path encoding), and continuous-variable (CV) photonics, where quantum information is encoded in modes of the electromagnetic field (e.g. in the phase-space variables of squeezed states). DV photonics maps directly onto qubits; CV photonics works with qumodes and typically relies on Gaussian states and measurements. We focus on DV here, as it aligns with "qubits".

In a typical photonic circuit, one needs sources that emit single photons on demand in a particular quantum state (for instance, a quantum dot device, or a parametric down-conversion source that produces entangled photon pairs). These photons travel through an optical network of interferometers (beam splitters, phase shifters, waveguides). Because photons lack a direct nonlinear interaction, linear optical elements by themselves cannot entangle two independent photons; conditional measurements are required. However, by sending photons into a beam splitter and exploiting quantum interference (the Hong-Ou-Mandel effect, in which two identical photons entering a 50:50 beam splitter always exit together from the same output port), one can obtain entangled outcomes conditioned on certain measurement results.
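
To illustrate that interference effect, the sketch below (NumPy/SciPy in a truncated two-mode Fock space, rather than any photonics library) builds a 50:50 beam-splitter unitary and applies it to the two-photon input |1,1⟩: the coincidence probability comes out as zero and the photons always bunch, which is exactly the Hong-Ou-Mandel effect described above.

```python
import numpy as np
from scipy.linalg import expm

# Two-mode Fock space truncated at n_max photons per mode (exact for 2 photons total,
# since the beam splitter conserves photon number).
n_max = 3
dim = n_max + 1
a = np.diag(np.sqrt(np.arange(1, dim)), 1)     # single-mode annihilation operator
I = np.eye(dim)
a1 = np.kron(a, I)                             # mode 1
a2 = np.kron(I, a)                             # mode 2

# 50:50 beam splitter: U = exp(i * pi/4 * (a1^dag a2 + a2^dag a1))
H_bs = a1.conj().T @ a2 + a2.conj().T @ a1
U = expm(1j * (np.pi / 4) * H_bs)

def fock(n1, n2):
    """Product Fock state |n1, n2>."""
    v1 = np.zeros(dim); v1[n1] = 1
    v2 = np.zeros(dim); v2[n2] = 1
    return np.kron(v1, v2)

out = U @ fock(1, 1)                            # two identical photons, one per input port
p_coincidence = abs(fock(1, 1) @ out) ** 2
p_bunched = abs(fock(2, 0) @ out) ** 2 + abs(fock(0, 2) @ out) ** 2
print(f"P(one photon in each output) = {p_coincidence:.3f}")   # ~0: never in different ports
print(f"P(both photons in same port) = {p_bunched:.3f}")       # ~1: Hong-Ou-Mandel bunching
```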

One straightforward approach to a photonic CNOT is the following: interfere the control and target photons with additional ancilla photons on a network of beam splitters such that, upon detecting the ancillas in certain detector outcomes, the remaining two photons emerge entangled as if a CNOT had happened. If the detectors click in an unexpected pattern, the gate failed (which means one has to try again). The KLM protocol provided specific interferometer setups for this. The success probability might be low (e.g. 1/16 for a basic approach), but using more ancillas can boost it.

Cluster-state / one-way computing: Rather than performing gates one by one with low probability, another strategy is to pre-generate a highly entangled resource state – for example, a 2D cluster state of photons (which can be visualized as a large entangled web). Once you have that cluster, you perform single-photon measurements in particular bases to effectively enact logical gates. The unmeasured part of the cluster acts as the evolving quantum state of the computation, and each measurement consumes part of the cluster. This is called one-way quantum computing because the measurements irreversibly consume the cluster. Photonics suits this well because cluster states can be generated via sources that emit multiple entangled photons at once or by entangling photons from successive emissions. PsiQuantum's approach, for instance, is to use silicon photonic circuits and create a massive cluster state via fusion measurements (entangling smaller clusters into bigger ones). The advantage is that the "gates" become just measurements, which are relatively easy (detecting photons with single-photon detectors). The challenge shifts to reliably producing a huge cluster of entangled photons (millions of photons with a specific entanglement graph) and doing so faster than photons are lost.
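
The teleportation step at the heart of one-way computing can be checked numerically. The following sketch (NumPy only, a two-qubit toy rather than a real photonic cluster) entangles an arbitrary input state with a |+⟩ ancilla via CZ, measures the input qubit in a rotated basis, and confirms that the surviving qubit carries H·Rz(−φ) applied to the input, with Rz(−φ) = diag(1, e^(−iφ)); the other measurement outcome would simply add a Pauli-X correction, the kind of correction feed-forward electronics (listed below) would apply.

```python
import numpy as np

rng = np.random.default_rng(0)

def normalize(v):
    return v / np.linalg.norm(v)

# Arbitrary single-qubit input state carried by "photon 1".
psi = normalize(rng.normal(size=2) + 1j * rng.normal(size=2))
plus = np.array([1, 1]) / np.sqrt(2)            # ancilla "photon 2" in |+>
CZ = np.diag([1, 1, 1, -1]).astype(complex)

state = CZ @ np.kron(psi, plus)                 # minimal two-qubit cluster carrying |psi>

phi = 0.7                                       # measurement angle chooses the gate
m0 = np.array([1, np.exp(1j * phi)]) / np.sqrt(2)   # outcome-0 basis state on qubit 1

# Project qubit 1 onto the outcome-0 state; qubit 2 then holds the processed state.
out = normalize(np.kron(m0.conj(), np.eye(2)) @ state)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
Rz = np.diag([1, np.exp(-1j * phi)])
expected = normalize(H @ Rz @ psi)

# Overlap up to a global phase should be 1: the measurement enacted H * Rz(-phi).
print(abs(np.vdot(expected, out)))              # ~1.0
```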

There is also the continuous-variable approach using squeezed light: rather than single-photon qubits, one uses squeezed vacuum states in optical modes to build large cluster states of modes. Gaussian operations (beam splitters, phase shifts) are easy on these, and certain quantum computations become possible by injecting non-Gaussian elements (such as a photon-counting measurement at some stage). Xanadu focuses on this approach, with Gaussian boson sampling and CV quantum computing using squeezed light and threshold detectors.
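For a numerical feel of what squeezing means, here is a small sketch (truncated Fock space in NumPy/SciPy, using the convention x = (a + a†)/√2 so the vacuum variance is 1/2; the 40-photon cutoff is an assumption comfortable for r ≈ 1) that builds a squeezed vacuum state and checks that its x-quadrature variance drops to roughly e^(−2r)/2.

```python
import numpy as np
from scipy.linalg import expm

dim = 40                                        # Fock-space cutoff (assumed sufficient for r ~ 1)
a = np.diag(np.sqrt(np.arange(1, dim)), 1)      # annihilation operator
adag = a.conj().T

r = 1.0                                         # squeezing parameter (~8.7 dB)
S = expm(0.5 * r * (a @ a - adag @ adag))       # squeezing operator S(r)
vac = np.zeros(dim); vac[0] = 1
sq = S @ vac                                    # squeezed vacuum state

x = (a + adag) / np.sqrt(2)                     # quadrature with vacuum variance 1/2
var_x = np.real(sq.conj() @ (x @ x) @ sq) - np.real(sq.conj() @ x @ sq) ** 2
print(f"numerical Var(x) = {var_x:.4f}")
print(f"analytic exp(-2r)/2 = {np.exp(-2 * r) / 2:.4f}")   # should agree closely
```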

Key components in photonic QC:

  • Feed-forward electronics: since many schemes require detecting an ancilla and then applying a correction to other photons (e.g. if a measurement yields one result, a Pauli X may need to be applied to another photon to complete the gate), one needs low-latency feed-forward. In an optical system, that means fast switching or phase shifting based on detector signals, ideally within nanoseconds if everything is in fiber or on a chip.
  • Single-photon sources: ideally on-demand sources of identical photons. These can be spontaneous parametric down-conversion (SPDC) in nonlinear crystals (which produces photon pairs at random times, so one photon can be heralded by detecting its partner) or quantum emitters (e.g. semiconductor quantum dots that emit single photons when excited). Quantum dot sources have recently demonstrated high-purity single photons at high rates.
  • Linear optical network: integrated photonic chips with waveguides and beam splitters (Mach-Zehnder interferometers) can route and mix photons with precise phase control. These are often reconfigurable via thermo-optic or electro-optic phase shifters.
  • Photon detectors: single-photon detectors (like silicon avalanche photodiodes for visible/near-IR, superconducting nanowire detectors for telecom wavelengths) that can detect individual photons with high efficiency. They often only tell if ≥1 photon is present, not how many (though transition-edge sensors can resolve number).

Comparison To Other Paradigms

Photonic quantum computing is unique in that its qubits are flying particles – a photon is not easily held in place (unless it is trapped in a cavity, which temporarily turns it into something akin to a stationary qubit). Most other paradigms have "stationary" qubits (atoms in a trap, electrons in a solid, etc.). Photons being mobile is great for communication: they are naturally suited to connecting distant nodes (hence quantum networks and QKD rely on photons). For computing, it means that implementing something like a quantum memory or a long computation requires either looping photons around (delay lines) or converting flying qubits into stationary form at times. Some proposed architectures use optical-fiber delay loops to store photons while other operations complete.

A massive plus is room-temperature operation – photonic circuits and detectors can operate at or near room temperature (with exceptions: the best detectors, superconducting nanowires, need cryogenic cooling to ~2.7 K, though future on-chip SPADs may improve on this). Photons also do not couple strongly to typical noise sources – there is no need for extreme isolation from magnetic fields and the like. The biggest error source is loss: photons can be absorbed or simply not detected (detector inefficiency). Loss acts somewhat like an erasure error (you know a photon was lost because a detector failed to click when it should have). Too much loss kills the computation because you lose track of qubits, but in principle one can design photonic systems with very low loss and use redundancy to tolerate some of it.

Another benefit: Potentially easier scaling in components – optical waveguide chips with hundreds of beam splitters are already routine in photonic integrated circuits for telecom, so quantum photonics can leverage that fabrication base. Companies like PsiQuantum use standard silicon photonics (300 mm wafers, etc.) to make their devices, promising faster scaling once the fundamental performance is achieved. Photons do not interact, so each optical component is largely independent and does not disturb its neighbors, meaning a lot can be packed onto a chip (subject to loss and phase stability). In contrast, superconducting or spin qubits on a chip can cross-talk through microwave fields or substrate phonons, making isolation tricky at large scale.

However, the probabilistic nature of gates is a major drawback: without a nonlinear medium, two-photon gates are not deterministic. KLM showed it is possible, but with high overhead. In practical terms, this means you need either a very large overhead (many attempts in parallel, combined with multiplexing, so that a gate is effectively available on demand) or a way to obtain effective nonlinearities. One promising path is using matter systems to mediate photon-photon interactions: e.g., an atom in a cavity can act as a small quantum gate between a photon and the atom, thereby entangling photons. Another is Rydberg atomic ensembles, which cause two photons to acquire a phase if they are present simultaneously (a Rydberg blockade for photons, being researched as a route to deterministic photonic gates). But these hybrid approaches move away from pure photonics.
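
A back-of-the-envelope way to see the multiplexing overhead: if one gate attempt succeeds with probability p, then N parallel attempts yield at least one success with probability 1 − (1 − p)^N. The numbers below are illustrative only, not taken from any published architecture.

```python
import math

def attempts_needed(p, target=0.99):
    """Parallel attempts N so that at least one succeeds with probability >= target."""
    return math.ceil(math.log(1 - target) / math.log(1 - p))

for p in (1 / 16, 1 / 4, 1 / 2):
    n = attempts_needed(p)
    print(f"per-attempt success {p:.3f}: need {n} multiplexed attempts for 99% gate success")
```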

Current Development Status

Photonic quantum computing has seen significant experimental progress but is not yet at the scale of executing arbitrary algorithms with many qubits. We have seen:

  • Boson sampling experiments: These are specialized photonic experiments (not universal computing, but a quantum-advantage demonstration candidate) in which many indistinguishable photons pass through a large interferometer network and the output distribution is sampled. In 2020, USTC's Jiuzhang experiment achieved 76-photon boson sampling, a task intractable for classical simulation with normal resources. In 2021 they followed up with Jiuzhang 2.0, detecting even more photons (up to 113). Xanadu's Borealis photonic quantum computer (a programmable Gaussian boson sampler) used 216 squeezed modes, detecting up to 219 photons (roughly 125 on average) per sample, and also claimed a quantum advantage in 2022. These are not general-purpose computers, but they show photonics handling large numbers of optical modes and photons (a minimal permanent-based sketch of boson sampling follows this list).
  • Small quantum circuits: Groups at the University of Bristol, MIT and elsewhere demonstrated few-qubit photonic chips running basic algorithms (a two-qubit quantum Fourier transform, three-qubit GHZ state preparation, etc.) in the late 2000s and 2010s. Using bulk optics, demonstrations have reached roughly 10-12 entangled photons (e.g. USTC's 10-photon entangled state in 2016 and 12 photons in 2018). Integrated chips have seen on the order of four qubits entangled on-chip.
  • Cluster state generation: Experiments have generated small photonic cluster states (e.g. four-photon linear clusters and six-photon graph states), and in 2022 a team generated a 12-photon 2D cluster state on a chip. Time-multiplexing approaches have also created cluster states in the time domain, using a single squeezed-light source and loop delays to entangle thousands of temporal modes in continuous-variable experiments.
  • Companies:
    • PsiQuantum: pursuing a fault-tolerant quantum computer with photonics. They have kept a low profile on experimental achievements but claim to be on track to build a million-qubit photonic cluster-state machine by combining many photonic chips and thousands of detectors, etc. They partner with GlobalFoundries for photonic chip fabrication. They likely have demonstrated some key building blocks (like small cluster generation, perhaps integrated single-photon sources).
    • Xanadu: based in Toronto, working on a photonic quantum computing platform using continuous variables (squeezed light) and some single-photon detection for non-Gaussian operations. They offer a cloud "Quantum Processing Unit" called Borealis that performs Gaussian boson sampling, which had 216 modes and was used to show quantum advantage in 2022. They are also developing fault-tolerant schemes (they propose encoding qubits in GKP states of oscillators, which could be generated in their system).
    • QuiX, ORCA Computing, Quandela: various startups focusing on photonic processors or simulators, though not all aim at universal QC. Quandela (France) offers a small cloud-accessible photonic processor with a few photonic qubits (programmed through its Perceval framework) and is also working on a cluster-state approach.
    • Intel and others invest in silicon photonics research which indirectly supports these efforts (e.g. Intel demonstrated a silicon photonic chip with a quantum dot single-photon source integrated, etc.).
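
To make the boson-sampling idea above concrete, here is a miniature calculation (pure NumPy; the choice of 3 photons in 6 modes and a Haar-random unitary is arbitrary, for illustration only). It enumerates the collision-free output patterns and computes each probability as |Perm(U_sub)|², the permanent whose classical hardness underlies these experiments.

```python
import itertools
import numpy as np

def permanent(M):
    """Brute-force permanent; fine for tiny matrices, exponentially costly in general."""
    n = M.shape[0]
    return sum(np.prod([M[i, p[i]] for i in range(n)])
               for p in itertools.permutations(range(n)))

def haar_unitary(m, rng):
    """Random m x m unitary drawn (approximately) from the Haar measure via QR."""
    z = (rng.normal(size=(m, m)) + 1j * rng.normal(size=(m, m))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / abs(np.diag(r)))

rng = np.random.default_rng(1)
m, n = 6, 3                          # 6 modes, single photons injected into modes 0, 1, 2
U = haar_unitary(m, rng)

probs = {}
for outs in itertools.combinations(range(m), n):    # collision-free output patterns only
    sub = U[np.ix_(range(n), outs)]                 # rows = input modes, columns = output modes
    probs[outs] = abs(permanent(sub)) ** 2          # detection probability of this pattern

print(f"collision-free probability mass: {sum(probs.values()):.3f}")
print("most likely output pattern:", max(probs, key=probs.get))
```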

Currently, photonic devices have not run something like Shor's algorithm on meaningful numbers because scaling remains an issue, but they lead in the raw number of modes/photons used in certain non-universal tasks (like boson sampling). The development status is thus experimental and early-prototype: there is no general-purpose photonic NISQ processor akin to IBM's or IonQ's machines yet, but there has been substantial progress in special-purpose photonic computing and in the underlying components.

Advantages

Room-temperature and potentially mass-producible: Photonic chips can be made in semiconductor fabs from silicon, silicon nitride, etc., leveraging decades of photonics-industry advances. Once a design is proven, scaling manufacturing is more straightforward than, say, fabricating millions of Josephson junctions, which may face materials challenges. Photonic systems can operate at room temperature (except the detectors when SNSPDs are used, though SPADs with somewhat lower performance work at or near room temperature). No extreme cooling or vacuum is needed.

High speed and parallelism: Photons propagate quickly, and many photons can fly in parallel through different channels of an optical chip (massively parallel operations). Detector and laser repetition rates in the GHz range are feasible, meaning a high clock speed for certain operations (though feed-forward can currently be a bottleneck when it relies on electronics). In the one-way model the entire computation is just a network of beam splitters (traversed at the speed of light through the chip) followed by detectors firing – it could be very fast in principle.

Long coherence length: Photons essentially do not decohere in the usual sense (a photon maintains its polarization or path superposition over long distances, limited mainly by the phase stability of the optical paths). This is great for quantum communication and also means one can build physically large systems (the size of an optical table, or even spanning labs) and still maintain entanglement – not possible with, say, superconducting qubits, which must be kept on a small, well-isolated chip to maintain coherence.

Natural for networking: If modular quantum computing is envisioned, photonic interconnects are the best way to link distant quantum modules (even in other paradigms). So even if photonics isn’t the main compute paradigm, it’s indispensable for connecting quantum systems. But here, photonics itself is the system.

Uniquely suited to certain tasks, such as simulating photonic processes or boson sampling: Photonics can realize high-dimensional Hilbert spaces (each mode adds dimension, and multiple photons create rich quantum states). Boson sampling exploits exactly this to do something that is hard classically. So photonic processors may have niches where they outperform others even at small scale (though those niches may not directly solve commercial problems yet).

Disadvantages

Probabilistic gates and resource overhead: As discussed, without deterministic photon-photon interactions one has to consume many photons to get one logical gate. For fault-tolerant operation, estimates show photonic approaches need a very large number of physical photons and high source rates (PsiQuantum often cites generation rates on the order of 10^8 photons per second and extremely low loss for error correction). This overhead amounts to a very large blow-up at the physical level – the price paid for lacking deterministic physical two-qubit gates. It is a daunting engineering challenge, requiring thousands of integrated high-efficiency sources and detectors.

Loss and fault-tolerance demands: Every optical component has some loss – fiber has loss per kilometer, waveguides have coupling loss, and detectors are not 100% efficient. When many photons undergo many steps, the probability that all of them survive drops exponentially unless loss is extremely low. This is why loss budgets for fault-tolerant photonic computing may require, for example, <1% loss per optical element and source/detector efficiencies above 99%. Achieving that consistently in a large system is tough.
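
To see why the per-component budget is so tight, here is a short illustration (numbers chosen arbitrarily, not from any specific architecture) of how survival probability compounds across N lossy elements:

```python
# A photon traversing N elements, each with the given loss, survives with probability (1 - loss)^N.
for loss in (0.01, 0.001):
    for n_elements in (10, 100, 1000):
        survival = (1 - loss) ** n_elements
        print(f"loss/element = {loss:.1%}, {n_elements:4d} elements: survival = {survival:.5f}")
```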

Noisy or slow feed-forward: Many photonic experiments are essentially one-shot – everything is measured at the end. But a full computer sometimes needs to measure one photon and quickly act on another that is still in flight. The electronics to do that (detect, process, reconfigure another photonic element) take time – typically at least tens of nanoseconds even with fast hardware – which means the other photon may need to be delayed (in a fiber loop, for example) while waiting for the signal. This adds complexity and can introduce more loss. Some schemes avoid real-time feed-forward through clever encodings, but usually at the cost of more overhead.
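
For a rough sense of what that delay costs, the sketch below converts a feed-forward latency into the length of standard telecom fiber needed to hold the photon, and the extra loss incurred (the group index of about 1.47 and the 0.2 dB/km figure are typical silica-fiber values, assumed here purely for illustration).

```python
c = 299_792_458            # speed of light in vacuum, m/s
n_group = 1.47             # assumed group index of silica fiber
loss_db_per_km = 0.2       # assumed fiber loss near 1550 nm

for t_ns in (10, 50, 100):
    length_m = (c / n_group) * t_ns * 1e-9          # fiber length needed to delay by t_ns
    loss_db = loss_db_per_km * length_m / 1000.0    # loss accumulated in that fiber
    survival = 10 ** (-loss_db / 10)
    print(f"{t_ns:3d} ns delay -> {length_m:6.1f} m of fiber, survival ~ {survival:.4f}")
```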

Scaling sources and detectors: Involving, say, a million photons means handling on the order of a million detection events. Arrays of many photon detectors are still uncommon (projects have built SNSPD arrays with tens to hundreds of pixels, but integrating thousands with fast readout remains a challenge), and running them at high rates can cause saturation or dead-time issues. On the source side, quantum dot single-photon sources are improving, but combining dozens or hundreds of them with identical spectra and timing is a research problem. SPDC can produce many photons but is probabilistic and inefficient at scale, because most pulses contain zero photons or more than one (it is hard to get exactly one reliably without discarding many events).
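
That SPDC trade-off can be quantified with the idealized single-mode thermal pair-number distribution P(n) = μⁿ/(1+μ)ⁿ⁺¹, where μ is the mean pair number per pulse (an assumption of a single Schmidt mode, used here only to show the trend): raising μ to get more heralded photons also raises the multi-pair error probability.

```python
def thermal_p(n, mu):
    """Thermal photon-pair statistics of an idealized single-mode SPDC source."""
    return mu**n / (1 + mu)**(n + 1)

for mu in (0.01, 0.1, 0.5):
    p1 = thermal_p(1, mu)
    p_multi = 1 - thermal_p(0, mu) - p1
    print(f"mean pairs/pulse = {mu:4.2f}: P(exactly 1 pair) = {p1:.4f}, P(>=2 pairs) = {p_multi:.4f}")
```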

Quantum error correction is non-trivial: While in theory one can correct photon loss and gate errors through redundant encoding (e.g. bosonic codes or logical cluster states), performing the required parity checks in photonics is itself challenging (detecting errors takes more photons and gates, but those gates are themselves probabilistic – a bootstrapping problem). This is being researched (e.g. using bosonic qubits such as GKP states as loss-tolerant logical qubits), but it is still at an early stage.

Impact On Cybersecurity

A full-scale, fault-tolerant photonic quantum computer could run Shor's algorithm and break public-key encryption just like any other universal quantum computer. Photonics in fact plays a double role in cryptography: it also underpins quantum key distribution (QKD), which provides quantum-resistant key exchange by distributing keys via photons. One could therefore envision a future in which photons are used both to break classical crypto (by running quantum algorithms) and to secure communications (via QKD) – a bit of a race. However, photonic quantum computing is currently nowhere near factoring large numbers; it is mostly used for specialized tasks and small experiments. If PsiQuantum's plan succeeds, they aim for a million-qubit machine in the late 2020s or early 2030s that could indeed run large-scale algorithms (they specifically cite chemistry and error-corrected algorithms as goals). That would certainly be powerful enough for cryptanalysis – but it is a big "if", with many technical milestones still to be reached. In the meantime, photonics mostly supports the defensive side of cryptography: for example, photon-based quantum repeaters could enable widespread QKD, serving as a mitigation before cryptographically relevant quantum computers come online.

One thing to note: Quantum key distribution (QKD) uses single photons to share keys securely – it is a communications protocol, not a quantum computer, and it is already commercial. While not directly a "computing paradigm", it is a photonic quantum technology relevant to cybersecurity: a defense against the quantum threat that leverages the laws of physics, rather than computational hardness, to distribute keys. China's quantum satellite "Micius" famously demonstrated intercontinental QKD in 2017, and many telecom providers are testing QKD for high-security links (banks, data centers, etc.). So photons are, ironically, both a key to breaking certain cryptography (if used in a quantum computer) and the basis of key-distribution schemes that even a quantum computer cannot eavesdrop on.

Future Outlook

Photonic quantum computing is often cited as high-risk, high-reward. If the technical issues (high-efficiency sources, integrated low-loss circuits, massive detector arrays) can be solved, a photonic system could potentially leapfrog others because of the relative ease of scaling manufacturing. PsiQuantum's vision is to build a million-qubit machine largely by tiling many photonic chips and linking them with fiber – plausible with existing semiconductor fabrication capacity, rather than requiring each qubit to be individually calibrated in a dilution refrigerator. Within the next five years we may see the first small photonic computers that operate a few error-corrected logical qubits (perhaps via a cluster-state approach consuming a few thousand physical photons). If that happens, it will validate the paradigm. If not, photonics may remain the realm of specialized demonstrators, such as boson-sampling machines used as physical random-number generators or as simulators of molecular vibronic spectra.

In parallel, photonic integration with other qubit types is likely to grow. Trapped-ion and superconducting systems use photons to communicate between cryogenic or ion-trap modules; nitrogen-vacancy centers in diamond (solid-state spin qubits) use photons to entangle separate NVs. So photonics is also the bridge between distant qubits in other paradigms, and the distinction between "computation" and "communication" may blur: one design could be a hybrid in which static qubits handle memory and logic while photons handle communication between them – a distributed quantum computer. Topological or Kerr-mediated photonics might also see breakthroughs: a chip with an on-demand photon-photon interaction (via a quantum nonlinear material or an efficient photon-mediated entangler) would dramatically reduce overhead and make photonic circuits behave more like conventional circuits, just optical. Some research explores, for example, Rydberg atoms in a glass cell to make photons interact for a gate – if that can be integrated, it would be a game-changer.

By the late 2020s we will likely know whether photonics can contend in the fault-tolerance race. If so, photonic quantum computers might become leaders thanks to their scaling advantages; if the challenges persist, photonics may pivot toward quantum interconnects and simulators rather than general-purpose computers. Either way, given the heavy investment (PsiQuantum has raised enormous funding, Xanadu as well, albeit less), there is strong momentum behind this paradigm. For cybersecurity, a successful photonic quantum computer would mean the ability to break encryption no longer rests only with dilution-fridge labs but possibly with network-integrated devices – underscoring the need for post-quantum cryptography standards to be in place well before such machines come online.
