
Quantum Computing Paradigms: Photonic Continuous-Variable QC (CVQC)

(For other quantum computing paradigms and architectures, see Taxonomy of Quantum Computing: Paradigms & Architectures)

What It Is

Continuous-variable quantum computing encodes quantum information in continuous degrees of freedom, such as the quadratures of light (analogous to position and momentum), rather than binary quantum bits. In a photonic CV system, a quantum mode of the electromagnetic field (often called a qumode) carries information in properties like the amplitude and phase of light. Measuring such a mode can yield a continuous range of outcomes (any real value within some range), unlike a qubit measurement which yields a 0 or 1. Each photonic mode corresponds to an oscillator with an infinite-dimensional Hilbert space, meaning it can theoretically hold more information than a two-dimensional qubit.

Photonic systems are a promising platform for CVQC because quantum states of light are relatively accessible and controllable using well-established tools of quantum optics. Lasers can produce coherent states of light (think of a minimal quantum version of a classical electromagnetic wave), and nonlinear optical devices can produce squeezed states, which have reduced quantum uncertainty in one quadrature at the expense of increased uncertainty in the conjugate quadrature. Squeezed light is a key resource for CV quantum information processing, as it enables entanglement between modes and the creation of Gaussian quantum states (states whose Wigner function is Gaussian-shaped). In fact, many quantum optics experiments – like the generation of squeezed vacuum or entangled light beams – serve as the foundation for CVQC. Crucially, optical Gaussian operations (those that preserve the Gaussian nature of states, such as beam splitters, phase shifts, and squeezers) are readily implemented with standard optical components. This makes photonics attractive: we can entangle thousands or even millions of modes of light deterministically using linear optics and squeezers, something that would be extremely challenging with discrete qubit systems.

In summary, photonic CVQC uses continuous quantum variables (like field amplitudes) to represent and process information. It leverages the mature field of quantum optics – lasers, beam splitters, nonlinear crystals, homodyne detectors – to prepare and manipulate quantum states. This continuous-variable approach is “analog” in spirit (information is encoded in continuous values), and it stands as a powerful alternative to the more common qubit-based (“digital”) quantum computing paradigm. The next sections delve into the theory and practice of CVQC, citing key literature and experiments that have shaped the field.

Key Academic Papers

Several influential papers have established and advanced the concepts of continuous-variable quantum computing. Below is a summary of key academic works in the field, along with their main contributions:

  • Lloyd & Braunstein (1999), “Quantum Computation over Continuous Variables.” This seminal paper proved that quantum computing is possible using continuous variables, and it identified what’s needed for universality. The authors showed that a set of operations using linear optics (beam splitters and phase shifters), squeezers, and at least one nonlinear element (such as a Kerr nonlinearity or equivalent) is sufficient for universal quantum computation with continuous quantum variables. This work essentially launched CV quantum computing as a theoretical possibility.
  • Gottesman, Kitaev & Preskill (2001), “Encoding a qubit in an oscillator.” Commonly known as the GKP paper, this introduced a revolutionary quantum error-correcting code for oscillators (continuous variables). They proposed encoding a logical qubit into a harmonic oscillator’s continuous phase space using a grid of squeezed states. The paper showed that with this encoding, one could perform fault-tolerant universal quantum computation using only Gaussian operations (linear optics, squeezers) plus simple measurements, provided the special encoded states (now called GKP states) can be prepared. This gave a blueprint for error correction in CV systems and connected continuous variables with the standard qubit error-correction framework.
  • Braunstein & van Loock (2005), “Quantum information with continuous variables.” This was an early comprehensive review (spanning quantum communication and computing) that consolidated knowledge of CV quantum information. It underscored that continuous-variable carriers are a powerful alternative to qubits and surveyed experimental progress up to that point. This review, along with a later one by Weedbrook et al. (2012), became a standard reference for newcomers to the field.
  • Menicucci et al. (2006), “Universal Quantum Computation with Continuous-Variable Cluster States.” Nicolas Menicucci and colleagues extended the cluster-state model (one-way quantum computing) to continuous variables. They proposed that one can generate a large entangled Gaussian state of many optical modes (a CV cluster state) using squeezed light and beam splitters, and then perform computations via sequential measurements on the modes. Importantly, they showed that Gaussian operations alone are not sufficient for universality – one must include at least one non-Gaussian element, such as a single-mode non-Gaussian measurement, to achieve a universal gate set. This paper laid the groundwork for measurement-based CV quantum computing, which has become the leading experimental approach in photonic CVQC.
  • Menicucci (2014), “Fault-Tolerant Measurement-Based Quantum Computing with Continuous-Variable Cluster States.” This work addressed the critical question of fault tolerance in the CV one-way computing scheme. It determined a threshold on the initial squeezing: about 20.5 dB of squeezing is required in the cluster-state generation so that, when combined with qubit error-correcting codes, error rates fall below the fault-tolerance threshold. In other words, if each mode of the cluster state has squeezing above this threshold, one can in principle perform arbitrarily long quantum computations by concatenating with known qubit codes. This result, later improved by other architectures, set a concrete (if challenging) target for experimentalists.
  • Larsen et al. (2021), “Fault-Tolerant Continuous-Variable Measurement-based Quantum Computation Architecture.” (One representative of several recent papers on fault tolerance.) This work combined CV cluster states with the GKP error-correcting code to design a fault-tolerant architecture. By using three-dimensional cluster states and embedding GKP error correction, the authors reduced the squeezing threshold for fault tolerance to about 12.7 dB – a significant improvement over the earlier 20.5 dB estimate. It shows how hybrid continuous-discrete techniques can lower the experimental requirements for a full-scale CV quantum computer.
  • Asavanant et al. and Larsen et al. (both Science 2019), “Deterministic generation of a two-dimensional cluster state.” These two simultaneous experiments (one in Japan and one in Denmark) demonstrated the creation of massive entangled cluster states of light using time-domain multiplexing. Asavanant et al. reported a 2D cluster state in a 5×1240 mode lattice structure, and Larsen et al. achieved a similar feat. These cluster states involve thousands to over a million entangled optical modes, a scale far beyond what has been entangled in discrete-variable systems. Notably, the generated 2D clusters are compatible with implementing quantum error correction codes (like GKP states) for fault tolerance. These papers marked a major experimental milestone toward scalable CVQC.
  • Madsen et al. (2022), “Quantum computational advantage with a programmable photonic processor.” This paper (from Xanadu) unveiled Borealis, a photonic quantum computer that achieved quantum computational advantage using Gaussian boson sampling. Borealis used 216 squeezed optical modes sent through a dynamically programmable interferometer, with results detected via photon-number-resolving detectors. The experiment showed that Borealis could generate samples from a complex distribution in about 36 microseconds, whereas the best classical simulation would take an estimated 9,000 years. While focused on a specific task (boson sampling) rather than general-purpose computation, this work is a landmark demonstration of the computational power of photonic CV systems at scale.

These papers (among others) form the backbone of photonic CVQC literature. They established the theoretical framework for CV universality and error correction, proposed practical models for implementation, and achieved record-breaking experimental demonstrations of entangled states and quantum advantage. Together, they highlight the rapid progress and unique strengths of the CV approach in quantum computing.

How It Works

In continuous-variable quantum computing, information is processed by manipulating the quantum state of oscillators – for photonic systems, these oscillators are modes of the electromagnetic field (e.g. specific frequencies or temporal modes of light). Understanding how CVQC works involves a few key concepts:

  • Quantum Modes (Qumodes) and Quadratures: A single mode of light can be treated like a quantum harmonic oscillator with two observable parameters called quadratures. These are typically labeled $\hat{x}$ (position or amplitude quadrature) and $\hat{p}$ (momentum or phase quadrature), and they satisfy the canonical commutation relation $[\hat{x}, \hat{p}] = i\hbar$, which implies an uncertainty relation (you cannot sharply define both, akin to position/momentum for a particle). In a coherent state (the quantum state closest to a classical laser state), $\hat{x}$ and $\hat{p}$ have equal Gaussian uncertainties at the vacuum level. One can think of a qumode as analogous to a qubit, but instead of a Bloch sphere of finite states, the qumode has an infinite continuum of basis states (often taken to be the eigenstates of $\hat{x}$ or $\hat{p}$). Operations on qumodes will generally be continuous transformations in this infinite-dimensional state space. (The Fock-space sketch after this list makes these operators concrete.)
  • Squeezed States and Gaussian Operations: A fundamental resource for CVQC is the squeezed state. Squeezing is an operation that reduces the uncertainty (variance) of one quadrature below the vacuum level at the expense of increased uncertainty in the conjugate quadrature. A perfectly infinitely-squeezed state would have no uncertainty in one quadrature – effectively giving a well-defined continuous variable, which could serve as an idealized logic state. In practice, one achieves finite squeezing (common experiments produce 5–15 dB of squeezing, where higher dB means closer to the ideal). Squeezed states, when combined via linear optical elements like beam splitters, can produce entangled Gaussian states. For example, interfering two single-mode squeezed states on a beam splitter yields a two-mode squeezed state, which is an entangled state of two modes (a building block for cluster states). Because the operations involved (squeezing + beam splitting) are Gaussian, the resulting state has a Gaussian Wigner function. Gaussian operations include: phase rotations, displacements (shifting the mean of a quadrature), beam splitters, and squeezers. These operations are relatively easy to implement with optical devices (phase shifters, mirrors, electro-optic modulators, nonlinear crystals, etc.) and they suffice to create large entangled resources. In fact, the gigantic cluster states mentioned above were generated using only Gaussian operations on many squeezed pulse modes. (The covariance-matrix sketch after this list walks through exactly this squeezing-plus-beam-splitter construction.)
  • Gaussian vs. Non-Gaussian States: While Gaussian states/operations are convenient, a crucial theoretical insight is that Gaussian operations alone cannot realize universal quantum computing. This is because Gaussian processes can be efficiently simulated on a classical computer (they correspond essentially to linear dynamics of quadrature variables, akin to classical noise distributions). To unlock the full power of quantum computing, non-Gaussian elements must enter the picture. Non-Gaussian states include single-photon and other photon-number states, or more exotic states like the GKP grid states; non-Gaussian operations include things like photon-counting measurements or the application of a cubic phase gate (a specific unitary operation with a third-order nonlinearity in quadrature). The cubic phase gate, for instance, implements the transformation $\hat{U}_{\text{cubic}} = \exp(i \gamma \hat{x}^3)$, which is not Gaussian. Implementing such a gate deterministically is difficult because it requires strong optical nonlinearities that are not readily available. Instead, an equivalent effect can be achieved by preparing a non-Gaussian resource state off-line (like a specially shaped ancilla state) and then using measurement-induced gating. For example, one could prepare a state that approximates a cubic phase state and then use quantum teleportation: by teleporting an unknown state through this ancilla with appropriate measurements, the cubic phase operation is imparted on the state. Similarly, photon counting (a non-Gaussian measurement) on part of an entangled state can induce an effective non-Gaussian operation on another mode. The key takeaway is that a hybrid of Gaussian operations + at least one non-Gaussian resource yields a universal gate set. (A toy Fock-space construction of the cubic phase gate follows this list.)
  • Gate Operations in CVQC: How does one perform logic gates or algorithms with continuous variables? Some gates have direct analogs: for instance, a SUM gate (or CV CNOT) can be defined as an operation that shifts the $\hat{x}$ quadrature of one mode by an amount proportional to the $\hat{x}$ of another mode. This can be realized by coupling two modes on a beam splitter and using feed-forward of measurement results. In one-way (cluster-state) quantum computing, gates are applied by measurements: if you have a pre-entangled cluster of modes, performing a homodyne measurement (measuring a quadrature) on one mode will project the remainder of the cluster into a state where an effective gate has been applied to the other modes, depending on the measurement basis and outcome. By choosing measurement settings adaptively (and using classical feed-forward to adjust outcomes), one can implement an arbitrary Gaussian transformation on the remaining modes of a cluster state. Non-Gaussian gates are implemented by injecting non-Gaussian ancillas into the cluster or by using non-Gaussian measurements at certain steps. In a gate-based (circuit) model, one would sequentially apply gates like displacements, rotations, squeezers (Gaussian) and occasionally a non-Gaussian gate (like the cubic phase). But experimentally, the measurement-based model is often favored: you create a large entangled state (like a CV cluster), then perform a sequence of measurements to realize a given quantum circuit. The computing “hardware” is the entangled state plus flexible detectors, rather than a sequence of physical two-mode gates applied in series. (A quick numerical check that the SUM gate is a valid Gaussian operation appears after this list.)
  • Quantum Error Correction in CV: Continuous-variable systems are prone to specific errors, like noise and losses that cause small random shifts in quadratures. Interestingly, the GKP code introduced earlier provides a way to correct such shift errors by encoding a qubit into a comb-like superposition of oscillator eigenstates. In principle, if one can initialize GKP code states (which are non-Gaussian) inside a CV quantum computer, one can use syndrome measurements (e.g., measuring a quadrature modulo the code’s lattice spacing) to detect and correct small shifts in $\hat{x}$ or $\hat{p}$. Fault-tolerant CVQC schemes often envision using CV cluster states as a backbone, with GKP states embedded to provide error correction. Operationally, this might mean teleporting GKP-encoded states through a cluster and using the redundancy of the encoding to fix errors on the fly. While this is complex, it is analogous to how discrete quantum computers would use surface codes – except here the redundancy is in the analog phase space of an oscillator. (A toy Monte Carlo of the GKP shift-correction rule appears after this list.)
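
As a concrete illustration of the “Squeezed States and Gaussian Operations” bullet above, here is a minimal NumPy sketch of the covariance-matrix formalism: two single-mode squeezed vacua interfere on a 50:50 beam splitter, and the output violates the Duan-Simon separability bound, certifying entanglement. The conventions (ħ = 1, vacuum quadrature variance 1/2) and the specific beam-splitter matrix are illustrative choices, not tied to any particular experiment.

```python
import numpy as np

# Conventions: hbar = 1, x = (a + a^dag)/sqrt(2), vacuum quadrature variance = 1/2.
# A Gaussian state is fully described by its covariance matrix V, which
# transforms as V -> S V S^T under a symplectic (Gaussian) operation S.
# Quadrature ordering: (x1, p1, x2, p2).

r = 1.0  # squeezing parameter (about 8.7 dB)

# Two single-mode squeezed vacua: mode 1 squeezed in x, mode 2 squeezed in p.
V = 0.5 * np.diag([np.exp(-2 * r), np.exp(2 * r),
                   np.exp(2 * r), np.exp(-2 * r)])

# 50:50 beam splitter acting identically on the x and p quadrature pairs.
t = 1 / np.sqrt(2)
BS = np.array([[ t, 0, t, 0],
               [ 0, t, 0, t],
               [-t, 0, t, 0],
               [ 0, -t, 0, t]])
V_out = BS @ V @ BS.T

# EPR-type collective quadratures u = x1 - x2 and v = p1 + p2.
c_u = np.array([1.0, 0, -1, 0])
c_v = np.array([0.0, 1, 0, 1])
var_sum = c_u @ V_out @ c_u + c_v @ V_out @ c_v

# Duan-Simon criterion: every separable state has var_sum >= 2 in these units.
print(f"Var(u) + Var(v) = {var_sum:.3f} (separable bound: 2)")  # -> 0.271
```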
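
The quadrature operators and the cubic phase gate from the “Gaussian vs. Non-Gaussian” bullet can likewise be checked numerically in a truncated Fock basis. This is only a toy model (the cutoff `dim` and gate strength `gamma` are arbitrary illustrative values), but it verifies the canonical commutator away from the truncation edge and applies the non-Gaussian gate $\hat{U}_{\text{cubic}} = \exp(i \gamma \hat{x}^3)$ to the vacuum.

```python
import numpy as np
from scipy.linalg import expm

# Truncated Fock-space model of a single qumode (hbar = 1).
dim = 40                                      # Fock-space cutoff
a = np.diag(np.sqrt(np.arange(1, dim)), k=1)  # annihilation operator
x = (a + a.conj().T) / np.sqrt(2)
p = -1j * (a - a.conj().T) / np.sqrt(2)

# Canonical commutator [x, p] = i holds exactly below the truncation edge.
comm = x @ p - p @ x
print(np.allclose(comm[:dim // 2, :dim // 2], 1j * np.eye(dim // 2)))  # True

# Cubic phase gate: a non-Gaussian unitary (approximate near the cutoff).
gamma = 0.1
U_cubic = expm(1j * gamma * np.linalg.matrix_power(x, 3))

vac = np.zeros(dim, dtype=complex)
vac[0] = 1.0
psi = U_cubic @ vac
print(np.round(np.abs(psi[:5]) ** 2, 4))  # photon population spreads beyond |0>
```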
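
The SUM gate from the “Gate Operations” bullet has a simple Heisenberg-picture (symplectic) representation, and a few lines of NumPy confirm that it preserves the symplectic form, i.e., that it is a legitimate Gaussian gate. The quadrature ordering and sign convention below are one common choice.

```python
import numpy as np

# SUM gate (CV analog of CNOT) on (x_c, p_c, x_t, p_t):
#   x_t -> x_t + x_c,  p_c -> p_c - p_t,  x_c and p_t unchanged.
S = np.array([[1, 0, 0,  0],
              [0, 1, 0, -1],
              [1, 0, 1,  0],
              [0, 0, 0,  1.0]])

# A valid Gaussian gate must preserve the symplectic form Omega.
Omega = np.kron(np.eye(2), np.array([[0, 1], [-1, 0.0]]))
print(np.allclose(S @ Omega @ S.T, Omega))  # True: SUM is symplectic
```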
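
Finally, the GKP shift-correction rule from the last bullet lends itself to a toy Monte Carlo: Gaussian-distributed quadrature shifts are corrected by snapping back to the nearest grid point of spacing $\sqrt{\pi}$, and a logical error occurs whenever the true shift rounds to an odd multiple of $\sqrt{\pi}$. The noise levels below are arbitrary illustrative values.

```python
import numpy as np

rng = np.random.default_rng(7)
d = np.sqrt(np.pi)  # grid spacing of the square-lattice GKP code

for sigma in (0.2, 0.4, 0.6):  # standard deviation of the shift noise
    shifts = rng.normal(0.0, sigma, 200_000)
    # Correction snaps each shift to the nearest multiple of sqrt(pi);
    # rounding to an odd multiple flips the encoded logical qubit.
    logical_error = (np.round(shifts / d) % 2 == 1)
    print(f"sigma = {sigma}: logical error rate ~ {logical_error.mean():.4f}")
# The trend is the point: modestly more shift noise -> sharply more errors.
```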

In essence, CVQC with photonic modes works by exploiting the rich, continuous quantum dynamics of light. Squeezers and beam splitters entangle many modes into a resource state, homodyne measurements (measuring quadratures) perform Gaussian transformations easily, and occasional use of photon-counting or specially prepared states injects the necessary nonlinearity for universal computing. The entire process can happen at the speed of light – literally – as pulses pass through optical networks and are measured by fast photodetectors. This paradigm is powerful, but it also comes with significant challenges (especially the generation of non-Gaussian resources and error correction), which we will discuss later.

Comparison to Other Paradigms

Continuous-variable quantum computing has some clear differences from the more familiar discrete-variable (qubit-based) quantum computing. Here we compare CVQC to other leading quantum computing paradigms:

  • Discrete-Variable (qubit-based) Quantum Computing: In DV quantum computing, information is stored in qubits that yield discrete outcomes (0 or 1) when measured. Examples include superconducting qubits, trapped ion qubits, spin qubits, or even single-photon dual-rail qubits. The CV approach, by contrast, stores information in analog quantum variables, giving a continuous range of outcomes. Both CV and DV models are theoretically equivalent in computational power (each can simulate the other given enough resources). However, their implementations differ. CV qumodes can carry more information per physical system (since a mode’s state is an infinite-dimensional vector, versus a qubit’s two-dimensional state). This can be advantageous for certain tasks like simulating continuous systems or encoding multiple qubits into one mode using clever encodings. On the flip side, controlling an infinite-dimensional system with precision is hard – small noise can shift a continuous variable by a small amount and still cause errors, whereas a qubit might flip or not flip. DV qubits are often conceptually simpler (on or off states) and a large body of quantum algorithm theory has been built around qubits. In summary, CV is an analog quantum model while DV is digital; CV can leverage well-developed optical tech, while DV has more developed error-correction routines so far.
  • Superconducting Qubits vs. Photonic CV: Superconducting quantum processors (used by IBM, Google, etc.) use microwave photons confined to superconducting circuits as qubits. Interestingly, those microwave resonators are actually oscillators, so one could use them in a CV manner – but typically they are operated in a regime where only two energy levels act as a qubit (a DV approach). Superconducting platforms have achieved high-fidelity gate operations (99%+ for two-qubit gates in some cases) and have demonstrated small error-corrected codes, but they require cryogenic temperatures and face challenges in scaling beyond a few hundred qubits due to microwave wiring and cross-talk. Photonic CV systems, by contrast, operate at room temperature (the light itself doesn’t need cooling, though some detectors might) and scaling in number of modes is conceptually easier by just adding more light pulses or spatial channels. However, photons do not naturally interact with each other, whereas superconducting qubits have direct couplings (through circuit elements) that make two-qubit gates straightforward. Photonic CVQC often relies on interference and measurement for effective interactions, which can be probabilistic or require large entangled resources. Another difference: superconducting circuits typically hold qubits in place (static physical qubits on a chip), executing gates in time, whereas photonic systems often push data in motion (flying qumodes through a circuit). This means that photonic approaches can use time-multiplexing to reuse the same physical components for many modes – an elegant form of scalability – but it also means one needs precise synchronization and real-time feed-forward as the photons race through. Both platforms are being pursued for fault-tolerant quantum computing, and in fact hybrid architectures are considered (e.g., using microwave-to-optical interfaces to connect a local superconducting processor with photonic links for networking).
  • Trapped Ions vs. Photonic CV: Trapped ion quantum computers use individual ions (charged atoms) as qubits, storing information typically in discrete internal states of each ion. They achieve some of the highest gate fidelities and have very uniform qubits, but operations are relatively slow (microsecond to millisecond gate times) and scaling to large numbers (beyond ~100) is challenging due to interference between ions and the difficulty of handling many laser beams. Photonic CV systems, on the other hand, can have extremely fast operation rates (GHz-scale optical modes) and parallelism (many modes propagating at once), enabling potentially massive throughput. The trade-off is that ions have strong Coulomb-mediated interactions that make multi-qubit gates possible with careful laser pulses, whereas photons require intermediary strategies to “talk” to each other (beam splitters + ancilla measurements, or an optical nonlinearity which is weak in most materials). Moreover, ions are essentially static qubits held in a trap, whereas photonic modes are usually not stored for long (though optical delay lines or fiber loops can serve as short-term memory). In terms of error correction, both trapped-ion and superconducting qubit systems have made progress with qubit codes (like surface codes, Shor code, etc.), while photonic CV is exploring alternative codes (GKP, bosonic codes) that could potentially offer more efficiency for certain error types (like photon loss). An interesting bridge: trapped ions themselves have vibrational modes which are continuous variables, and in fact early proposals for CVQC were demonstrated in ion trap systems (using the ions’ collective vibrational modes as qumodes). So, trapped ions can embody both paradigms: DV (internal states) and CV (motional states).
  • Other Photonic Approaches (Discrete-variable photonics): Photons can also be used in qubit-like ways. For example, the field of linear optical quantum computing (LOQC) uses single photons as qubits, typically with dual-rail encoding (presence of a photon in one path = $|1\rangle$, in the other = $|0\rangle$). The famous Knill-Laflamme-Milburn (KLM) scheme (2001) showed that you can do universal quantum computing with single photons, beam splitters, and photon detectors – but the gates are probabilistic, requiring many attempts to succeed. In practice, discrete photonic qubit approaches have faced challenges in scaling because deterministic two-photon gates are hard (they need either giant nonlinearities or complex interference with ancillas). CV photonic computing offers a different route: using squeezed states (which can be generated deterministically by optical parametric amplifiers) to create large entangled webs, and performing computations by measurements that are also deterministic (homodyne measurements always give a result, as opposed to single-photon approaches where you might sometimes get vacuum or need post-selection). The big photonic quantum computing companies illustrate this split: Xanadu focuses on CV (squeezed light and Gaussian boson sampling), while PsiQuantum is betting on discrete single-photon qubits and photonic circuits for a fault-tolerant one-way computer (they use single-photon cluster states). PsiQuantum’s approach still requires probabilistic fusion gates to build large cluster states and thus needs billions of photons to get a large fault-tolerant machine. In contrast, CV cluster states can be built more directly with squeezers, but then the burden shifts to the detection and error correction side. Another photonic approach is boson sampling, a specialized model (not universal computing) where many indistinguishable photons are sent through a linear interferometer – the original proposal was with single photons (a DV approach), but Gaussian boson sampling (GBS) uses squeezed states (CV) as input. GBS has been at the heart of recent quantum advantage demonstrations, highlighting how mixing continuous variables (squeezed vacuum states) with discrete detection (photon counting) can yield a powerful computational task.

In summary, CVQC differs from other paradigms in how information is encoded and manipulated. It often trades off easier state preparation and entanglement (thanks to Gaussian operations) with more challenging logic gates (needing non-Gaussian elements). It leverages optical hardware that scales in modes, offering potential room-temperature and high-bandwidth operation, whereas leading qubit modalities have strong interactions but need heavy infrastructure. It’s not that one paradigm strictly dominates – indeed, research is ongoing into hybrid schemes that try to get the best of both (for instance, using CV hardware with embedded qubit encodings, or connecting superconducting qubits via optical CV links). Ultimately, both continuous and discrete models are capable of universal computation; the race is on to see which can achieve practical, large-scale quantum computing first, or whether they will coexist and specialize in different tasks.

Current Development Status

Photonic continuous-variable quantum computing has seen rapid development on both the experimental research front and in the emergence of companies building prototypes. Here we summarize the latest progress:

  • Large-Scale Entangled States: As mentioned, an impressive milestone was the generation of ultra-large CV cluster states. In 2016, researchers generated cluster states with over one million entangled modes in a one-dimensional chain, and in 2019, fully two-dimensional cluster lattices involving thousands of modes followed. These experiments used a time-domain multiplexing technique: a pulsed laser and optical delay lines were used to entangle a continuous train of squeezed light pulses into a massive cluster state. The resulting entangled state can be thought of as a “resource” that is large enough (in principle) to perform many sequential quantum operations or to serve as the backbone of a one-way quantum computer. The current state-of-the-art cluster is a 2D square lattice of 5×1240 modes demonstrated by Asavanant et al. Such resource states are universal for CV quantum computing (provided non-Gaussian measurements like GKP syndrome measurements are incorporated) and are compatible with error-correcting codes. The production of these states was a proof of scalability for CVQC, showing that one can entangle orders of magnitude more modes than there are qubits in any current discrete platform.
  • Gaussian Boson Sampling and Quantum Advantage: A major headline in recent years was the achievement of quantum computational advantage with photonic processors. In 2020, the USTC group in China (Jian-Wei Pan’s team) claimed a quantum advantage with their Jiuzhang experiment, which performed Gaussian boson sampling with squeezed light sent through a fixed interferometer. In 2022, Xanadu (a Canadian quantum computing company) demonstrated Gaussian boson sampling on their device Borealis, taking the idea further. Borealis is a photonic processor that injected 216 squeezed-light modes into a reconfigurable three-stage interferometer, effectively entangling them in a complex Gaussian state, and then measured the output with photon-number-resolving detectors. The experiment implemented GBS in a programmable way (the interferometer’s beam splitter phases were programmatically set), and the sampling results clearly outpaced what any classical supercomputer could do. Specifically, producing one sample from the target distribution would take classical algorithms an estimated 9,000 years, whereas Borealis did so in microseconds. This was a special-purpose computation (sampling from a probability distribution defined by a random optical network), not a general algorithm like factoring. But it demonstrated that photonic CV hardware had reached a level of scale and complexity beyond classical reach, marking it as the first publicly available quantum machine with a claimed advantage (Borealis was made accessible via the cloud). This achievement underscored the viability of photonic CV approaches for near-term quantum advantage and stimulated interest in what other tasks GBS or related photonic systems could tackle. (A minimal GBS program sketch appears after this list.)
  • Small-Scale Quantum Algorithms: Alongside big demonstrations, researchers have also implemented proof-of-concept algorithms on CV hardware. For example, in 2023, Enomoto et al. (University of Tokyo) reported running a continuous-variable version of the Quantum Approximate Optimization Algorithm (QAOA) on a programmable photonic quantum processor. They used a few-mode CV system (with squeezed light and an adaptive measurement-based circuit) to solve small instances of an optimization problem, highlighting that algorithms originally formulated for qubits can be mapped to CV circuits. This kind of experiment shows the flexibility of CV hardware: by programming the measurement basis and using feed-forward control, one optical setup can realize different quantum circuits or algorithmic routines.
  • Integration and Chips: On the hardware side, there’s a push toward integrated photonics for quantum computing. Traditional quantum optics experiments often involve tabletop optical components. Today, researchers are moving these onto photonic chips (using materials like silicon, silicon nitride, or lithium niobate). For CVQC, integration means having on-chip sources of squeezed light (e.g., ring resonator optical parametric oscillators), on-chip interferometers, and potentially on-chip detectors. Xanadu, for instance, uses silicon nitride chips with embedded nonlinear waveguides to generate squeezing, routing the light through reconfigurable interferometers. In 2021, a collaboration demonstrated a four-mode chip that produces squeezed light and interference for a simple CV quantum computation task. The aim is to scale up the number of modes on chip and reduce loss, to eventually have a fully integrated photonic quantum processor. The European Quantum Flagship project “CLUSTEC” is focusing on thin-film lithium niobate integration for CV cluster states, which could allow low-loss and fast control in photonic circuits. While integrated photonic CV computers are still in early prototype stages, progress is steady, and they promise greater stability and scalability (just as integrated electronic circuits replaced bulk electronics).
  • Companies and Commercial Efforts: Several startups are targeting photonic quantum computing, seeing an advantage in room-temperature operation and easy scaling in some dimensions. The most notable for CV is Xanadu (based in Toronto). Xanadu has openly embraced the CV model: they developed the software library Strawberry Fields for CV quantum programming, have built devices like the 8-mode X8 and the 216-mode Borealis, and are pursuing a roadmap toward a fault-tolerant photonic quantum computer. Xanadu’s approach combines Gaussian operations on chip with off-chip photon counting to supply non-Gaussian capability, as evidenced by Borealis. Another player is PsiQuantum (based in California); however, PsiQuantum is focusing on photonic qubits (single-photon DV approach) rather than CV – aiming for a very large scale (they claim a plan for a million-qubit photonic cluster state machine). PsiQuantum’s work is relevant to photonic QC broadly, but it does not use continuous variables. IBM and Google, the big names in quantum computing, are focused on superconducting qubits, though IBM Research has explored hybrid CV encoding (like demonstrating small GKP states in superconducting microwave cavities). In the broader photonic ecosystem, companies like QuiX Quantum (Netherlands) produce photonic chips for quantum information (mainly interferometers for boson sampling or discrete quantum walks) and could potentially be used for CV experiments too. On the quantum communication side (closely related to CVQC), companies like Toshiba and QuantumCTek work on continuous-variable QKD systems, pushing the technology of squeezed-light detection which also benefits CVQC. In summary, Xanadu stands out as the leader in photonic CV quantum computing commercialization, having delivered both scientific milestones and cloud-accessible prototypes. Their success with Borealis has validated the approach and is likely to be followed by incremental improvements (more modes, better fidelity, integration with error correction). The field has a healthy blend of academic research and startup activity, with significant interest in leveraging photonic CV platforms for near-term quantum computing tasks.
  • Noisy Intermediate-Scale Quantum (NISQ) Experiments: Current photonic CV devices are still in the NISQ era – they don’t have full error correction and thus computations are noisy and of limited depth. Nonetheless, researchers are finding NISQ-era applications for them. One area is quantum machine learning (see next sections for more) – e.g., variational quantum circuits on CV hardware can be used for generative modeling or simple classification tasks. Another area is using GBS devices for graph-based problem heuristics (like finding dense subgraphs or doing molecular vibronic spectra calculations). These applications are under exploration right now, essentially testing what useful things can be done with the photonic CV hardware available today. The development status can thus be characterized as: basic capabilities demonstrated (state generation, simple algorithms, quantum advantage in sampling), integration in progress, quest for error correction underway.
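
As a flavor of how such sampling experiments are programmed, here is a minimal Gaussian boson sampling sketch using Xanadu's Strawberry Fields library (mentioned above). The mode count, squeezing value, and the small fixed beam-splitter mesh are toy choices, not Borealis's actual parameters, and the snippet assumes a recent Strawberry Fields release whose Gaussian backend supports Fock sampling.

```python
import numpy as np
import strawberryfields as sf
from strawberryfields import ops

modes = 4
prog = sf.Program(modes)

with prog.context as q:
    # Deterministic squeezed-vacuum inputs from parametric sources.
    for i in range(modes):
        ops.Sgate(0.6) | q[i]
    # A small fixed interferometer mesh; a real GBS device uses a
    # programmable and much deeper mesh of beam splitters and phases.
    for layer in range(2):
        for i in range(layer % 2, modes - 1, 2):
            ops.BSgate(np.pi / 4, 0.0) | (q[i], q[i + 1])
    # Non-Gaussian step: photon-number-resolving detection on every mode.
    ops.MeasureFock() | q

eng = sf.Engine("gaussian")
samples = eng.run(prog, shots=5).samples
print(samples)  # rows are photon-count patterns, e.g. [0 1 1 0]
```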

In conclusion, photonic CV quantum computing has moved from theory to practice: massive entangled states have been created, specialized computations have exceeded classical capabilities, and early quantum processors are accessible outside the lab. The focus now is on improving the quality (reducing loss, increasing squeezing, better non-Gaussian operations) and incorporating error mitigation or correction, while scaling up the size of systems further. Companies like Xanadu are driving this forward with tangible prototypes, and ongoing research suggests even more exciting demonstrations (perhaps a small logical qubit encoded in light, or a CVQC performing an optimization better than a classical heuristic) in the near future.

Advantages of Photonic CVQC

Photonic continuous-variable quantum computing offers several notable advantages and unique features compared to other approaches:

  • Scalability in Number of Modes: Perhaps the biggest strength demonstrated so far is the ability to entangle a very large number of modes using CV techniques. Through time-multiplexing or frequency multiplexing of squeezed light, experiments have generated cluster states with thousands or even millions of entangled modes. This kind of scale is far beyond the count of qubits in any current discrete system. It suggests that, in terms of network size, CV systems can be scaled up relatively easily by extending the pulse train or adding more modes, without needing individual physical qubit carriers for each. Each mode is just a pulse of light, and creating more pulses is straightforward with a pulsed laser. This inherent parallelism and mode scalability is a huge plus for CVQC.
  • Deterministic Entanglement and Gates: Using optical parametric processes (like parametric down-conversion which produces squeezed light) and linear optics, entanglement can be generated on-demand and deterministically. In contrast, many photonic discrete schemes rely on probabilistic entangling operations (e.g., the KLM protocol where two photons only entangle with some probability). With CV cluster states, as long as the squeezers and interferometers are operating, entangled modes flow out continuously. Deterministic Gaussian gates mean fewer failed operations and less overhead in repeat-until-success gating. This gives CV systems a potential high success rate for operations, which is crucial when scaling up.
  • High Speed and Bandwidth: Photons travel at the speed of light and can be processed at GHz or higher rates. This means photonic CVQC can have an extremely high clock speed compared to, say, ion traps (which operate in kHz regime) or superconducting qubits (~MHz). For instance, a time-domain multiplexed system might process pulses at tens of MHz or more, enabling a huge number of operations per second. Additionally, different frequency modes of light can be used in parallel (wavelength multiplexing), leveraging the vast bandwidth available in optical systems. This could allow massively parallel operations and high-throughput quantum information processing.
  • Room-Temperature Operation: Photonic systems (optical photons) do not require cryogenic cooling. The quantum states of light can survive at room temperature (unlike superconducting qubits which need dilution refrigerators, or even ion traps which need ultra-high vacuum and laser cooling). This simplifies the engineering and potentially reduces the cost. (One caveat: certain detectors, like superconducting nanowire single-photon detectors used for non-Gaussian measurements, do require cooling. However, homodyne detectors — which measure quadratures — can be made from room-temperature photodiodes.) Overall, the core processing can be done on a benchtop with optics, making it conceivable to have photonic quantum processors in standard laboratory or even commercial environments without special cryogenics.
  • Established Optical Technology: CVQC leans heavily on the toolbox of classical and quantum optics, which is very mature. High-quality lasers, optical fibers, integrated photonic circuits, and modulators are all existing tech. This means CV hardware can piggyback on decades of development in fiber-optic communications and photonic integration. For example, modulators used in telecom can serve as fast phase shifters in a CV quantum circuit, and fiber technology can guide quantum light with relatively low loss (especially at telecom wavelengths). The use of telecom-band photons also means potential compatibility with existing infrastructure (fibers, network components), which is great for distributing quantum information or scaling systems by linking modules. In contrast, some qubit systems require entirely specialized fabrication and control electronics. The optical components for CV are off-the-shelf in many cases. As a 2012 review noted, “optical components effecting Gaussian processes are readily available in the laboratory”, highlighting the practical convenience on the experimental side.
  • Natural Interface with Quantum Communication: Photonic CV systems operate with light, which is the medium of communication. This creates an opportunity to seamlessly integrate quantum computing with quantum communication. For instance, the same type of entangled squeezed-light states used in a CV quantum computer can also be used for quantum teleportation or sending quantum information through an optical fiber. Thus, one can envision distributed quantum computing networks where CV quantum processors exchange states over optical links (possibly enabling a quantum internet). Discrete photonic qubit systems also have this advantage to some degree, but CV encoding (like continuous modulation) can sometimes achieve higher information rates in channels. CV quantum key distribution (QKD) is already a practical technology in fiber networks, benefiting from standard telecom components, which indicates that CV quantum info carriers are well-suited for networking. A CV quantum computer could be more easily connected to a QKD system or a quantum sensor, all using light as the medium.
  • Error-Correcting Codes in Infinite Dimensions: CV systems offer new possibilities for error correction that don’t exist for qubits alone. The GKP code is a prime example – it leverages the continuous phase space of a mode to redundantly encode a discrete quantum bit in a way that small shift errors can be corrected. These bosonic codes may prove more hardware-efficient for certain error models like small amplitude damping or photon loss. For example, a single bosonic mode with a GKP encoding can behave like a long-lived qubit, as demonstrated in microwave cavities. If such encodings can be generated in optical modes, a CV cluster state computer could directly incorporate error correction at the level of each mode. The continuous nature also allows for analog error syndrome measurements – one can measure how far off a quadrature variable is from the nearest codeword value and correct by shifting it back. In principle, this could correct small errors in a more fine-grained way than qubit codes (which typically deal with full bit-flip or phase-flip errors). The potential of such bosonic qubit encodings is an advantage of CV platforms – you inherit an infinite-dimensional Hilbert space that you can structure to protect information.
  • Scalable Fault-Tolerance Schemes: Building on the above, theorists have devised fault-tolerant architectures for CV that seem promising. For instance, using a 3D cluster state plus GKP error correction, one can get a fault-tolerant scheme with a feasible squeezing threshold (~12.7 dB). This approach implies that once moderate squeezing and reliable GKP ancillas are available, one could implement topological error correction (like a surface code) in a photonic cluster state, achieving scalability to large, error-corrected computations. The ability to combine topological codes (discrete) with CV codes gives potentially more flexibility in optimizing against noise. While this is not yet realized, it’s a conceptual advantage that CV systems can leverage two layers of coding (continuous and discrete) to battle errors.
  • Potential Computational Power and Analog Processing: Some arguments have been made that CV quantum computers might be very efficient for simulating other quantum systems that are themselves continuous. For example, quantum simulations of chemical bonds, vibrational modes in molecules, or quantum field theories might map naturally onto CV hardware. A harmonic oscillator (like a molecular vibrational mode) is directly represented by a mode in CVQC, possibly making simulation of oscillatory systems highly efficient. Moreover, certain algorithms (like Gaussian boson sampling) exploit the ability of CV systems to handle combinatorially large state spaces (GBS effectively samples from a distribution over many photon number combinations). This suggests that for specific tasks (graph problems, sampling problems), CV machines might offer exponential advantages. Indeed, the advantage demonstrated with GBS is one instance of that. More broadly, the continuous nature means CVQC might connect with analog computing ideas, where instead of digital bits, one uses analog values to do computation (with quantum enhancements). While we typically want a quantum computer to be universal and digital (to get error correction benefits), the analog aspect could in near-term be useful for things like solving certain nonlinear differential equations or as quantum analog processors for optimization. Research is ongoing in these directions.

In summary, photonic CV quantum computing offers outstanding mode scalability, deterministic operations, high speeds, and integration with existing tech and networks. It provides a different toolbox for error correction that could complement qubit approaches. These advantages make it a compelling approach, especially as experiments continue to push the limits on size and as the community works toward overcoming the remaining challenges (detailed next).

Disadvantages and Challenges

Despite its promise, photonic CV quantum computing faces a number of significant challenges and limitations. It’s important to understand these hurdles, as they define the current research directions and open questions:

  • Sensitivity to Loss and Noise: Photons can be lost in optical components (fibers, beam splitters, detectors), and any loss in a quantum optical system typically means irretrievable loss of information (since lost photons often equate to erasure errors). Continuous-variable encodings are particularly sensitive to loss and noise because they rely on the precise values of quadratures. For instance, a slight random phase noise or loss-induced amplitude damping will impart quadrature shifts. Unlike a qubit flip (which might be discretely detected and corrected), a small shift in a CV variable is a continuous error that can accumulate. While some shifts can be corrected with codes like GKP, any uncorrected noise directly degrades the quantum state (e.g., reducing squeezing or entanglement). In practical terms, optical losses and thermal noise limit the achievable squeezing and increase the effective noise in Gaussian operations. Current experiments might start with 10-15 dB squeezing at the source, but after passing through optics, the squeezing may degrade to lower values. High loss also impacts the success of non-Gaussian operations (like photon detection – losing a photon before it hits the detector can alter the outcome). In short, low-loss optical components and high-efficiency detectors are required, and those can be technologically challenging (though progress is being made, e.g., >99% efficient homodyne detectors exist in labs). (The numerical sketch after this list shows how quickly loss erodes squeezing.)
  • Requirement of Non-Gaussian Resources: As noted, Gaussian-only CVQC is not universal. The need for at least one non-Gaussian element is a major bottleneck. Currently, the primary ways to introduce non-Gaussianity are:
    • Photon counting measurements: e.g., measuring some modes with single-photon detectors. This is used in Gaussian boson sampling and can be used to induce entanglement or effective nonlinear operations. The challenge is that photon-number-resolving detectors are still technologically demanding (often relying on superconducting sensors) and can be slow or suffer from dark counts.
    • Non-Gaussian ancilla states: like the cubic phase state or GKP states. These are extremely hard to produce. A cubic phase state would require either a very large optical nonlinearity or a complex protocol with cat states or higher photon-number states. To date, a high-quality cubic phase operation has not been demonstrated in optics. GKP states have been produced in microwave cavities and ion traps, but no deterministic optical GKP state has been created yet (there are proposals, but they rely on steps like photon subtraction from squeezed states, which succeed only with low probability).
    The absence of easy, high-fidelity non-Gaussian resources is probably the single biggest roadblock to building a universal photonic CV computer today. One can do a lot with Gaussian states (simulations, boson sampling, Gaussian QKD), but to perform arbitrary algorithms or to correct errors fully, these special resources are needed. Researchers are investigating ways to generate them: e.g., using cat states (superpositions of coherent states) as a starting point for GKP, or using measurement-based schemes to distill non-Gaussian states from imperfect versions. But currently, this is an unsolved experimental challenge.
  • Finite Squeezing and Errors: Finite squeezing not only limits operations but also fundamentally limits the quality of a CV cluster state. An infinitely squeezed mode would be like a perfect qubit; with finite squeezing, the cluster state has some noise that effectively leads to computational errors if not corrected. One can treat finite squeezing as analogous to a noisy two-qubit gate in a qubit computer: if squeezing is too low, the entanglement is too weak to be useful for long computations. The 2006 Menicucci et al. paper already pointed out the need for a non-Gaussian element for universality, and later work quantified the squeezing threshold for fault tolerance. Initially, it was around 20.5 dB, meaning that if you can reduce each mode’s noise variance by a factor of ~100 (20 dB corresponds to a 100× reduction in variance), then error correction can keep up with errors. To give perspective, 20 dB squeezing is extremely high – the current record in optical labs is about 15 dB, and even that is with significant effort. More recent proposals lowered the threshold to ~12 dB by combining with GKP codes, which is closer to feasibility, but still quite demanding. Every decibel of squeezing is harder to get: more pump power in the nonlinear crystal, better isolation from losses, etc. So insufficient squeezing is a limitation – below a certain level, large-scale quantum advantage or fault tolerance might not be reachable because the states are too noisy. There is also a theoretical limit: without additional tricks, perfect quantum error correction on CV cluster states would require infinite squeezing, which is impossible, so practical CVQC must balance finite squeezing with clever encoding.
  • Error Correction Overhead: While bosonic codes like GKP are promising, they also come with heavy overhead in preparation and verification. To do quantum error correction in a CV system, one might need to frequently inject ancilla states (like GKP) and perform complex syndrome measurements. GKP error correction itself requires, for example, coupling the data mode to an ancilla and doing a high-resolution measurement of a quadrature modulo a spacing. Implementing these reliably with linear optics and finite resources is complex. Moreover, concatenating with discrete codes (as in some proposals) means you now need a lot of physical modes to represent one logical qubit (for instance, each GKP state might be a logical qubit, and then you need several of those to form a surface code patch). Thus, the resource overhead for full fault tolerance could be huge if approached naively – potentially thousands or millions of modes to get one logical qubit with long-lived coherence, when including everything (though similarly, discrete qubit schemes also need thousands of physical qubits for one logical qubit in surface codes). The field is still working on optimizing these overheads.
  • Real-time Feed-forward and Control: In measurement-based CVQC, once you measure one mode, you often need to apply a correction (like a phase-space displacement) to the remaining modes based on the outcome – this is the “feed-forward” in one-way quantum computing. In photonics, especially in a time-multiplexed system, this needs to happen very fast: before the next pulses arrive at whatever operation point that needs the correction. Implementing low-latency, high-bandwidth feed-forward electronics that can take a homodyne measurement result and adjust an optical modulator in nanoseconds is an engineering challenge. It’s being addressed (for example, recent experiments have demonstrated feed-forward at tens of MHz rates), but scaling that to more complex computations (with many conditional branches) will be difficult. Essentially, a photonic quantum computer might need an integrated classical processor next to it, handling these adjustments on the fly – a demanding symbiosis of classical and quantum tech.
  • Lack of On-Demand Single-Photon Sources: Some advanced CV protocols (and also discrete ones) would benefit from a perfect source of single photons or other non-Gaussian states on-demand. Currently, most single-photon sources are probabilistic (SPDC sources) and thus not synchronized well with deterministic Gaussian operations. If you want to inject a single photon at a specific time into your otherwise Gaussian circuit (for example, to create a hybrid entangled state or a photon-subtracted squeezed state), you typically rely on heralding (waiting for a detector click that indicates a photon was generated). This probabilism can slow down or complicate operations. There is work on deterministic photon generation using quantum dots or other emitters, but those have not yet been integrated into CV experiments widely. Until such sources are available and integrated, adding the non-Gaussian “spark” into a CV system can be a stochastic process that reduces the overall efficiency of the computation.
  • Measurement Limitations: Homodyne measurements (measuring quadratures) have technical limitations in precision and efficiency. Finite resolution of the analog-to-digital converters, electronic noise in detectors, and phase stability of the local oscillator (the reference beam for homodyne detection) all impose practical limits. In an ideal theory, homodyne gives a real number result with infinite precision; in reality, one has sampling noise and cannot distinguish two outcomes that are extremely close. This effectively discretizes or coarse-grains the measurement, which could impact algorithms requiring high precision. For photon counting measurements, dark counts and saturation (where multiple photons arriving might be mistaken for fewer if the detector isn’t perfectly number-resolving) can cause errors. So improving detectors (both homodyne and single-photon) is an ongoing challenge.
  • Hybrid Complexity: CVQC often will be part of a hybrid system (with some discrete logic or at least discrete decisions being made from measurement outcomes). Managing the interface between the analog quantum domain and the classical digital domain is complex. For example, when do we decide that an analog value corresponds to a particular logical outcome? GKP codes face this: one measures a continuous syndrome (like a quadrature value) that tells you how much an error shifted the state, and then you decide which “hexagon” of phase space you were in to snap it back. If the noise was too large (beyond half a spacing), you mis-correct. Such issues mean that error thresholds can be unforgiving – just a bit more noise or uncertainty can cause logical errors.
  • Resource State Generation vs Utilization: CV cluster states can be extremely large (a plus), but using them for computation requires consuming the entangled modes via measurements. If one wants a broad quantum computer (not just a single quantum wire), one needs a two-dimensional cluster state structure and possibly multi-layer (3D) cluster for error correction. Generating a 3D cluster (like a cube-like lattice in time and two spatial dimensions) might be necessary for full fault tolerance. This is far more complex than the 1D or 2D clusters generated so far. Each added dimension in entanglement typically means more sources or more complex routing of modes. It’s uncertain how easy it will be to scale from a 2D cluster of size N to a 3D cluster of similar size – likely it gets significantly harder. So, while the number of modes entangled has scaled, the structure of entanglement needed for universal QC with error correction is also complex. We might discover new hurdles in making, say, a topologically encoded cluster state.
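
Two quantities discussed above, loss sensitivity and squeezing measured in dB, can be connected with a few lines of arithmetic. The sketch below applies the standard pure-loss channel $V \to \eta V + (1-\eta)V_{\text{vac}}$ to a quadrature that starts at 15 dB of squeezing (roughly the current optical record), using the same vacuum-variance-1/2 convention as the earlier sketches; the transmissivities are illustrative values.

```python
import numpy as np

V_VAC = 0.5  # vacuum quadrature variance (hbar = 1 convention)

def squeezing_db(var):
    """Squeezing in dB relative to the vacuum quadrature variance."""
    return -10 * np.log10(var / V_VAC)

def apply_loss(var, eta):
    """Pure-loss channel with transmissivity eta: V -> eta*V + (1-eta)*V_vac."""
    return eta * var + (1 - eta) * V_VAC

var_in = V_VAC * 10 ** (-15 / 10)  # 15 dB squeezed input
for eta in (1.0, 0.95, 0.9, 0.8, 0.5):
    print(f"transmission {eta:.2f}: {squeezing_db(apply_loss(var_in, eta)):5.2f} dB")
# Even 5% loss drags 15 dB down to about 11 dB, far below the ~20.5 dB
# threshold quoted above; this is why low-loss optics is so critical.
```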

In summary, the main disadvantages of photonic CVQC revolve around error sensitivity and the difficulty of achieving the required non-Gaussian elements and extremely high-quality states. High squeezing, low loss, good detectors – these are tough but incremental improvements are happening. The lack of a straightforward, high-fidelity nonlinear gate in optics is a fundamental issue; it is analogous to early classical computing lacking a transistor and trying to compute with only linear components – you need that nonlinear “kick” which for us is a non-Gaussian quantum operation. Until this is solved, CV quantum computers will either remain specialized (like boson samplers) or rely on heroic probabilistic schemes. Nonetheless, research is intensely addressing these challenges, for example by exploring photonics in alternate regimes (Rydberg atomic optics for stronger nonlinearities, or utilizing measurement-based nonlinearity as much as possible) and by engineering better error correction that can tolerate the levels of noise we have. The path to a large-scale CV quantum computer is demanding, but not fundamentally blocked – it likely will require a combination of incremental tech advances and clever error mitigation to overcome these disadvantages.

Impact on Cybersecurity

Photonic CV quantum computing intersects with cybersecurity in a few important ways, both posing new threats and offering new tools for secure communication:

  • Threat to Classical Encryption: Like all forms of quantum computing, if and when photonic CVQC becomes powerful enough (i.e., able to run Shor’s algorithm or similar at large scale), it could break widely used public-key cryptographic schemes. This is not unique to CV – any universal quantum computer with sufficient qubits or qumodes could factor large numbers and compute discrete logarithms efficiently, thereby compromising RSA, ECC, and Diffie–Hellman-based cryptosystems. In practice, current photonic CV devices are far from running Shor’s algorithm on RSA-sized numbers. However, the rapid scaling of modes in photonic systems (with experimental progress like one-million-mode entanglement) suggests that if error correction is mastered, a photonic CV quantum computer could reach the cryptographically relevant scale. This is part of the reason there’s urgency in developing post-quantum cryptography. It’s worth noting that some classical encryption schemes (one-time pads, AES with sufficiently large key size) remain safe even against quantum attacks, so the presence of quantum computers mainly affects asymmetric crypto. Organizations like NIST have been standardizing quantum-resistant algorithms as a precaution. From a security planning perspective, one must assume quantum capability will arrive (regardless of platform – photonic CV being one contender) and thus cryptography should migrate to algorithms that are not vulnerable to Shor’s or Grover’s algorithms.
  • Continuous-Variable Quantum Key Distribution (CV-QKD): On the positive side, continuous-variable quantum techniques provide ways to secure communications against eavesdroppers by the laws of physics. CV-QKD is a form of quantum key distribution in which the sender modulates continuous variables (such as the amplitude and phase of a coherent state of light) to encode a secret key, and the receiver measures them with homodyne or heterodyne detection. The security comes from the fact that any eavesdropping induces additional noise (a consequence of the Heisenberg uncertainty principle) that the legitimate parties can detect by monitoring channel statistics (see the parameter-estimation sketch after this list). CV-QKD has practical advantages: it can use standard telecom components (lasers and photodiodes) and typically operates at high pulse rates, potentially yielding high key rates (on the order of megabits per second over metropolitan distances in fiber). CV-QKD has been demonstrated over tens of kilometers of fiber, and free-space CV-QKD (for satellite-to-ground links) is being explored. State-of-the-art protocols, such as Gaussian-modulated coherent-state QKD, have been proven secure against general attacks in the finite-size regime and are reaching maturity. This is directly relevant to photonic CVQC because many of the same components are used: a future photonic CV quantum computer might incorporate QKD modules to communicate its outputs or to form a secure network with other quantum nodes.
  • Quantum Cryptography beyond QKD: Beyond key distribution, CV quantum systems can enable other cryptographic primitives. For example, quantum digital signatures and quantum secret sharing schemes can be implemented with continuous variables. CV cluster states have been proposed as a way to distribute entanglement for networked cryptography protocols. Since a CV cluster state can entangle many parties (many modes) at once, one could use it for multipartite secure communication, in which multiple parties share a joint secret that only certain subsets can reconstruct (quantum secret sharing). These applications are still largely theoretical or in early experiments, but photonic CV systems are well suited to them because they can create entanglement cheaply and distribute it to multiple parties via beam splitters and optical links.
  • Security of Quantum Systems: There is also the matter of securing the quantum computer itself. A user sending quantum states into a photonic CV quantum cloud (like Xanadu's machine) might worry about the privacy of the data being processed – a concern addressed by blind and verifiable quantum computing protocols. Continuous-variable protocols exist for quantum authentication of optical states and for blind quantum computing, in which the server never learns the client's data. For instance, a user could encode their qumodes in a Gaussian cluster state with secret phase shifts such that the server cannot decipher the results without the key. These ideas are still being developed, but they highlight that cybersecurity in the quantum era will involve protecting quantum information in transit and in computation, not just classical data.
  • Post-Quantum and Hybrid Solutions: Photonic CV devices might also contribute to the development of quantum-safe encryption. While CVQC could break some cryptosystems, it might also run algorithms (such as cryptanalysis of lattice-based schemes, or heuristic solvers for certain hard problems) that probe the strength of post-quantum algorithms. More immediately, a photonic quantum random number generator (QRNG) – an optical device that converts quantum vacuum fluctuations into true random numbers – is essentially a byproduct of CV quantum technology: the vacuum noise at a homodyne detector is an excellent entropy source (see the second sketch after this list). Such QRNGs are already in commercial use to strengthen cryptographic key generation.
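
To make the eavesdropper-detection idea concrete, here is a minimal Monte Carlo sketch (NumPy) of the parameter-estimation step in a Gaussian-modulated coherent-state protocol. It is a toy model under stated assumptions, not a real implementation: everything is in shot-noise units, only one quadrature is simulated (a real receiver randomly switches quadratures or measures both), and `V_mod`, `T`, and `xi` are illustrative values.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy parameters, all in shot-noise units (SNU); values are illustrative.
n, V_mod = 100_000, 4.0   # number of pulses, Alice's modulation variance
T, xi = 0.5, 0.05         # "true" channel transmittance and excess noise

# Alice Gaussian-modulates the x quadrature of coherent-state pulses.
x_A = rng.normal(0.0, np.sqrt(V_mod), n)

# Channel: loss attenuates the signal; shot noise (1 SNU) plus any
# excess noise xi (which an eavesdropper would add to) appears at Bob.
x_B = np.sqrt(T) * x_A + rng.normal(0.0, np.sqrt(1.0 + T * xi), n)

# Bob homodynes; the parties publicly compare a subset of the data to
# estimate the channel transmittance and the excess noise.
T_hat = (np.cov(x_A, x_B)[0, 1] / np.var(x_A)) ** 2
xi_hat = (np.var(x_B) - T_hat * np.var(x_A) - 1.0) / T_hat
print(f"estimated T = {T_hat:.3f}, estimated excess noise = {xi_hat:.3f}")

# Eavesdropping necessarily raises the estimated excess noise; if it
# exceeds the protocol's security threshold, the parties abort.
```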
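
A photonic QRNG is even simpler in spirit: digitize homodyne measurements of the vacuum, then condition the result. In this sketch a Gaussian pseudo-random source stands in for the detector, and the XOR folding is a crude placeholder for the seeded randomness extractors (e.g., Toeplitz hashing) that real devices use:

```python
import numpy as np

rng = np.random.default_rng(7)

# A homodyne detector aimed at the vacuum outputs Gaussian-distributed
# voltages; here we model the digitized samples as N(0, 1) draws.
samples = rng.normal(0.0, 1.0, 100_000)

# Rawest possible extraction: the sign of each sample is one bit.
raw_bits = (samples > 0).astype(np.uint8)

# Crude conditioning: XOR adjacent bits to suppress residual bias.
# (A real QRNG applies a proper randomness extractor instead.)
bits = raw_bits[0::2] ^ raw_bits[1::2]
print(f"{bits.size} conditioned bits, empirical mean = {bits.mean():.4f}")
```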

In essence, photonic CV quantum computing has a dual relationship with cybersecurity: it is part of the quantum computing threat driving the need for new cryptography, but it also offers practical quantum technologies (like CV-QKD and QRNGs) to enhance security today. The continuous-variable approach in particular has made QKD more accessible (since it can integrate with existing telecom infrastructure easily) and may in the future support complex secure communication networks by leveraging entangled light. As the field progresses, we can expect photonic CV systems to be key players in both the offense (code-breaking) and defense (secure comms) aspects of cybersecurity in the quantum age.

Broader Technological Impacts

Beyond cryptography, photonic CV quantum computing could influence a range of technologies and industries. Here we explore some broader impacts and potential applications:

  • Quantum Machine Learning and AI: Continuous-variable systems are naturally suited to certain quantum machine learning (QML) models. For example, quantum neural networks (QNNs) have been proposed in the CV framework. In 2019, Killoran et al. showed how a neural-network structure can be implemented as a variational CV quantum circuit, with Gaussian gates implementing the linear transformations and non-Gaussian gates acting as activation functions (see the circuit sketch after this list). Photonic CV systems can encode data in continuous variables (much as classical ML deals with real numbers) and process them with a sequence of tunable quantum operations. Demonstrations using Xanadu's devices have included simple classification tasks (like identifying fraud patterns) and generative tasks (like a quantum optical network generating handwritten-digit images, or playing a role in a hybrid classical-quantum autoencoder). The Strawberry Fields software by Xanadu provides a platform to design and simulate such QML models on CV hardware. The hope is that certain QML algorithms might run efficiently on NISQ-era CV hardware, potentially giving advantages in learning tasks (for example, faster combinatorial optimization in a quantum neural network than in a classical one, or the ability to model distributions that are hard for classical nets). Additionally, since CV devices naturally output analog measurement results, they might interface well with classical neural nets (which also handle analog values) in a hybrid system. Overall, QML is a promising area where photonic CV hardware might find practical use even before full fault tolerance, perhaps in pattern recognition, generative modeling, or quantum-enhanced analysis of sensor data.
  • Optimization and Sampling Problems: Many real-world problems – in logistics, finance, engineering – boil down to optimizing a cost function or sampling from a complex probability distribution. Photonic CV computers have already shown strength in the sampling domain with boson sampling experiments. Gaussian Boson Sampling in particular has been connected to graph problems: one can encode a graph's adjacency matrix into the covariance matrix of a Gaussian state, so that the photon click patterns at the output correspond to structures in the graph (see the GBS sketch after this list). Researchers have shown that GBS can find approximate solutions to the densest-subgraph problem (an NP-hard problem in graph theory) by interpreting the output distribution. There are also suggestions that GBS could help in molecular chemistry – e.g., calculating vibronic spectra of molecules, which is essentially a problem of sampling distributions of vibrational quanta. More generally, a photonic CV computer could run variants of the Quantum Approximate Optimization Algorithm (QAOA) tailored to continuous variables (already demonstrated at a small scale) to tackle optimization tasks like portfolio optimization, resource allocation, or machine scheduling. Industries like finance or supply-chain management may benefit if such quantum optimization solvers outperform classical heuristics. We are not there yet in terms of scale, but these early experiments show the path. Another angle: CVQC might implement quantum heuristics for solving linear systems or differential equations, using continuous variables to represent the continuous quantities in the equations – which overlaps with quantum simulation as well.
  • Quantum Simulation of Physical Systems: One of the original motivations for quantum computers is simulating quantum systems. Photonic CV systems can efficiently represent bosonic systems (systems of oscillators or field modes), which appear across physics and chemistry. For example, vibrational modes of molecules (phonons) or quantum field modes map naturally onto photonic modes. Simulating chemistry often involves both fermionic electrons (which map to qubits) and bosonic vibrational modes (which map well to qumodes), so a combined discrete-continuous approach could be very natural for quantum chemistry – qubits for electronic states and CV modes for phonons or solvent modes. Even on their own, CV quantum simulators could model quantum optical systems (since they are quantum optical systems themselves) – studying models of nonlinear optics, for instance – or simulate spin systems via analogies to coupled-oscillator networks (through encodings that approximate spin phase spaces with continuous variables). While superconducting qubits have dominated quantum simulation of spins and fermions, photonic CV devices might specialize in simulating open quantum systems (where environmental noise is modeled by continuous variables) or quantum fields. A closely related domain is quantum metrology: photonic squeezed states are already used to improve precision, as in LIGO's gravitational-wave detection, where squeezed light reduced the noise floor, and CV entangled states such as two-mode squeezed states enable sensing beyond the standard quantum limit (see the covariance sketch after this list). CV quantum information thus straddles computing, simulation, and sensing, so advances in CVQC will feed into better quantum sensors and vice versa.
  • Integration into Existing Industries: Photonic CVQC could ride on the coattails of photonic integration in classical computing. For instance, classical silicon photonics is being explored for optical neural networks and optical computing to accelerate AI workloads. If photonic quantum processors can be integrated on similar platforms, one could imagine hybrid chips where classical photonics does some high-speed linear algebra and a quantum photonic co-processor tackles a part that benefits from quantum parallelism. Industries dealing with big data might use quantum sampling devices to reduce dimensionality or generate synthetic data more efficiently. The telecom industry might incorporate quantum repeaters or processors that use CV cluster states for quantum networks, enhancing secure communication infrastructure. In pharmaceuticals or materials, quantum simulators (possibly including CV ones for vibrational spectra) could predict molecular properties faster, speeding up R&D. It’s speculative, but if photonic quantum hardware becomes as practical as photonic classical hardware in the long run, it could see widespread deployment in data centers for specialized tasks – particularly because, being photonic, it might interface nicely with optical communication inside and between data centers.
  • Sensing and Metrology: This is tangential to computing but worth noting: CV quantum techniques have already revolutionized high-precision sensing – gravitational-wave detection being the prime example, where squeezed light improved sensitivity by roughly 40% at certain frequencies. In the future, more advanced CV entangled states (such as multimode squeezed or cluster states) might be used in sensing networks – imagine an array of sensors (interferometric telescopes, fiber acoustic sensors) sharing entanglement to surpass classical limits in collecting data. Photonic CV quantum processors could act as the backend that processes the correlated (and continuous) sensor data to extract more information than is classically possible. This ties into machine learning as well, since quantum enhancements in sensors might go hand in hand with ML analysis of the signals.
  • Quantum Networks and Distributed Computing: With CV entangled states, one can envision a quantum internet in which nodes share continuous-variable entanglement to perform joint computations and protocols (entanglement swapping, teleportation of states, and so on). For distributed quantum computing, two photonic CV quantum computers could be connected simply by sending light between them – potentially simpler than connecting matter-qubit computers, which need transduction to photons. CV entanglement is also convenient to distribute over fiber, because coherent and squeezed states are generated and detected deterministically (and can, within limits, be amplified), and CV-QKD can often tolerate more loss than single-photon QKD over similar distances. CV quantum networks may therefore support not just key distribution but distributed quantum processing – for example, two distant CV processors combining their modes to tackle a larger problem than either could alone (the teleportation-fidelity sketch after this list shows how the quality of such links depends on squeezing).
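
To make the CV-QNN structure above concrete, here is one minimal, untrained layer written with Xanadu's Strawberry Fields library (mentioned in the machine-learning bullet). It follows the Killoran et al. pattern – Gaussian gates as the affine part, a non-Gaussian Kerr gate as the activation – but every gate parameter below is an arbitrary illustrative value, not a trained model:

```python
import strawberryfields as sf
from strawberryfields.ops import BSgate, Dgate, Kgate, Rgate, Sgate

prog = sf.Program(2)

with prog.context as q:
    # Data encoding: write two input features into displacements.
    Dgate(0.3) | q[0]
    Dgate(0.1) | q[1]
    # "Affine" part of the layer: Gaussian operations only.
    BSgate(0.4, 0.0) | (q[0], q[1])  # interferometer ~ weight matrix
    Sgate(0.2) | q[0]                # squeezing ~ scaling
    Sgate(0.2) | q[1]
    Rgate(0.5) | q[0]                # phase rotation
    # Nonlinear "activation": a non-Gaussian Kerr gate on each mode.
    Kgate(0.1) | q[0]
    Kgate(0.1) | q[1]

# Kerr gates are non-Gaussian, so simulate in a truncated Fock space.
eng = sf.Engine("fock", backend_options={"cutoff_dim": 6})
state = eng.run(prog).state
print("(mean, variance) of photon number in mode 0:", state.mean_photon(0))
```

In a real variational model these parameters would be optimized by a classical training loop (Strawberry Fields also ships a TensorFlow backend for exactly that purpose).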
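
The graph-encoding trick behind GBS fits in a few lines as well: Strawberry Fields' `GraphEmbed` prepares a Gaussian state whose covariance matrix encodes a graph's adjacency matrix, and Fock measurements then produce click patterns biased toward densely connected node subsets. The tiny graph and the mean-photon setting are toy choices, and API details may vary between library versions:

```python
import numpy as np
import strawberryfields as sf
from strawberryfields.ops import GraphEmbed, MeasureFock

# Adjacency matrix of a toy 4-node graph (nodes 0, 1, 2 form a triangle).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

prog = sf.Program(4)
with prog.context as q:
    GraphEmbed(A, mean_photon_per_mode=0.5) | q  # graph -> Gaussian state
    MeasureFock() | q                            # photon-number detection

eng = sf.Engine("gaussian")
samples = eng.run(prog, shots=20).samples
print(samples)  # rows = shots, columns = photon counts per mode (node)
# Modes that click together often flag candidate dense subgraphs.
```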
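
The simulation-and-sensing bullet leans on the Gaussian formalism, in which a state is just a covariance matrix over quadratures. A short NumPy check (with an arbitrary squeezing value) shows the EPR-type correlation of a two-mode squeezed vacuum – the joint quadrature x1 - x2 fluctuates below the vacuum level, which is exactly the resource used in entanglement-enhanced sensing:

```python
import numpy as np

def tmsv_cov(r: float) -> np.ndarray:
    """Covariance matrix of a two-mode squeezed vacuum.

    Quadrature ordering (x1, p1, x2, p2), in units where the vacuum
    has variance 1 in each quadrature.
    """
    c, s = np.cosh(2 * r), np.sinh(2 * r)
    return np.array([[c, 0, s, 0],
                     [0, c, 0, -s],
                     [s, 0, c, 0],
                     [0, -s, 0, c]])

r = 1.0                      # illustrative squeezing parameter
V = tmsv_cov(r)
u = np.array([1, 0, -1, 0])  # coefficients of the joint quadrature x1 - x2
var = u @ V @ u              # Var(x1 - x2) = 2 * exp(-2r), below vacuum
print(f"Var(x1 - x2) = {var:.3f}, vacuum level = 2.000")
```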
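
Finally, on networking: the quality of a CV teleportation link is set by the squeezing of the shared two-mode resource. For the textbook case of teleporting an unknown coherent state at unity gain, the fidelity is F = 1/(1 + e^(-2r)); the few lines below simply evaluate that standard formula (this is not a device model) to show how link quality grows with squeezing:

```python
import numpy as np

# Unity-gain CV teleportation of a coherent state with a two-mode
# squeezed resource: F = 1 / (1 + exp(-2r)). F = 1/2 at r = 0 is the
# classical benchmark; F -> 1 as the squeezing increases.
for dB in [0, 3, 6, 10, 15]:
    r = dB * np.log(10) / 20  # convert squeezing in dB to the parameter r
    F = 1.0 / (1.0 + np.exp(-2 * r))
    print(f"{dB:2d} dB squeezing -> teleportation fidelity {F:.3f}")
```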

In summary, photonic CVQC has the potential to impact machine learning, optimization, quantum simulation, sensing, and networking. Many of these impacts derive from the same core capability: manipulating and correlating continuous quantum degrees of freedom at scale. While some applications (like quantum ML and optimization) are being tested on early devices, others (like fully quantum-integrated networks or simulation of complex quantum fields) will require more advanced hardware. The broader vision is a world where quantum and classical photonic technologies blend: secure communications channels feeding into quantum-enhanced processors that solve hard problems, possibly aiding everything from drug discovery to logistics and enabling ultra-sensitive measurement devices for science. Photonic CV quantum computing is a key piece in that vision due to its compatibility with optics and its unique strengths in handling continuous data.

Future Outlook

The future of photonic continuous-variable quantum computing looks promising, with a clear roadmap of challenges to overcome and milestones to achieve. Here are some predictions and expectations for the trajectory of research and development in this field:

  • Improved Non-Gaussian Capabilities: In the near to mid term, a major focus will be demonstrating high-quality non-Gaussian operations. We anticipate breakthroughs in generating resource states such as the cubic phase state and small bosonic-code states (GKP states; see the definition after this list). One milestone would be on-demand production of a single GKP qubit encoded in an optical mode; there is active research on using cat-state breeding and photon subtraction to approximate GKP states. Within the next five years, it is reasonable to expect experiments showing an optical GKP state with fidelity high enough to perform a single round of error correction on a simple logical qubit. Achieving this would be a game-changer, as it opens the door to incorporating error correction into photonic CV systems. Similarly, we might see the first implementation of a cubic phase gate (or an equivalent nonlinearity) via measurement-based methods, perhaps by distilling a cubic phase state and teleporting a mode through it. These advances will likely come from improved photon sources and detectors, perhaps leveraging technologies like quantum-dot single-photon sources or new nonlinear materials with stronger interactions.
  • Higher Squeezing and Lower Loss: On the hardware front, incremental progress in optics will accumulate. Reaching 20 dB of squeezing in an optical circuit (with manageable loss) is a target labs are pursuing – it may require better optical cavities, new nonlinear crystals, or cryogenic optical parametric amplifiers to reduce thermal noise. Likewise, integrated photonics can reduce loss by shrinking optical paths and avoiding fiber coupling; lithium-niobate-on-insulator (LNOI) waveguides have shown promise for combining nonlinearity (for generating squeezing) with low propagation loss. In the future, we may have a single photonic chip that generates, say, 8 squeezed modes at 15 dB each and interferes them with <1% loss – a superb starting point for a CV processor, since loss eats squeezing quickly (see the worked example after this list). Achieving such high-quality states reliably will push CVQC closer to fault-tolerance thresholds.
  • Fault-Tolerant Demonstrations: Perhaps within a decade, we could see the first fault-tolerant logical qubit implemented in a photonic system. This could be done by encoding a qubit in a GKP state (or a similar bosonic code) and using a CV cluster state with sufficient squeezing to protect it over a few computational steps, correcting errors as they occur. A concrete goal is to show that an error-corrected photonic logical qubit has a longer lifetime or lower error rate than the physical modes that compose it – a definitive sign of beating the noise. Once individual logical qubits can be stabilized, the next goal would be a logical two-qubit gate between them (perhaps via a larger entangled cluster or a teleportation-based gate), and ultimately a small fault-tolerant circuit (such as a logical T gate, or a logical CNOT on two logical qubits). Such achievements would likely require 3D cluster states and a combination of CV operations with discrete error-correcting-code techniques, essentially realizing the architectures proposed in theory.
  • Scaling and Modular Systems: On the scaling side, rather than one monolithic system handling hundreds of thousands of modes, we might see a modular approach. For instance, multiple smaller photonic processors (each generating a medium-sized cluster state) could be connected via fiber links and quantum teleportation to form a larger computer. This modular quantum computing approach aligns well with photonics, because modules can be linked by optical channels naturally. Each module might be an integrated photonic chip that interfaces with others through standard fiber connectors. The Quantum Internet initiatives around the world will drive improvements in how we connect quantum systems, and photonic CV nodes could become part of networked quantum computing clusters. One could foresee a situation where, to get more computational power, you simply add more photonic modules and connect them with entanglement links, analogous to adding more servers to a classical data center.
  • Commercial Quantum Supremacy and Utility: In the next few years, we expect claims of quantum supremacy (advantage) to be solidified and extended. The boson sampling experiments are already there; the next step is to find a task that is not just hard for classical computers but also has some utility. This could be in sampling (e.g., generating certifiable random numbers, which is useful in cryptography), or in a specialized optimization algorithm. For photonic CVQC to break into industry use, it needs to perform a useful task better than classical methods. One candidate might be something like quantum-enhanced Monte Carlo simulation for finance (some proposals exist using Gaussian quantum states to estimate risk measures faster). Another could be protein folding or drug docking problems attacked via GBS or QAOA-like approaches. We predict that within ~5 years, a photonic quantum processor (maybe still noisy) will be used to solve a toy version of an industry problem (say, optimize a small portfolio or sample likely reaction pathways in a chemical process) and will show a hint of outperforming a brute-force classical simulation of the same toy model. This will garner a lot of attention and likely funding for scaling that up.
  • Integration into Classical Systems (Hybrid Computing): In the future, quantum computing is likely to be integrated into classical computing workflows rather than standing alone. Photonic CVQC has an edge here: because it is based on light, it can interface with classical optical computing and communication. We might see co-processors where a classical computer offloads a subroutine to a photonic quantum processor – for example, a classical data center using a photonic quantum co-processor to generate hard-to-produce random samples or to run an expensive subroutine inside an optimization. Cloud providers (like Amazon and Microsoft) already offer access to various quantum chips; one can imagine specialized photonic CV chips available as a service, accelerated by classical pre- and post-processing optimized for their continuous-valued output. A concrete example could be a quantum sampling service: a photonic chip that, given input parameters, outputs samples from a probability distribution that classical methods struggle with – useful in machine learning for Boltzmann sampling or probabilistic inference tasks.
  • Quantum Network Integration: Further down the line (10+ years), if quantum networks mature, photonic CV quantum computers could become nodes on a global quantum internet, exchanging entangled states and jointly performing distributed quantum algorithms. This has profound implications: it could enable secure delegated quantum computing (where a user's qubits are distributed across multiple providers so that no single provider sees the whole computation), resilient cloud quantum computing (jobs teleported between quantum servers), and scalability (combining remote resources to act as one bigger computer). Photonic CV systems, being communication-friendly, are likely to play a large role here. Projects like CLUSTEC already emphasize developing networking protocols on CV cluster states. So the future may see CV quantum processors not as isolated devices, but as networked quantum processing units (QPUs) that flexibly connect and share workload.
  • Cross-Pollination with DV Qubits: We will likely see more hybrid continuous-discrete systems. For example, there could be a quantum computer that uses superconducting qubits for some tasks and optical CV modes for others. Already, experiments have entangled microwave (superconducting) oscillators with optical photons via transducers, hinting at a hybrid quantum network connecting different types of systems. A future quantum computing platform might use DV qubits for high-fidelity logic and CV photonics for communication and certain parallel operations. The synergy might also come in error correction: some discrete qubit systems may adopt bosonic encodings (like storing qubits in microwave CV modes and error-correcting them with CV techniques – a path pursued by Yale’s group). Conversely, CV systems might use discrete ancilla qubits (like a quantum memory that can interact with a photonic mode to help error-correct it). The boundaries between CV and DV could blur, and the most successful architectures might be those that take advantage of both paradigms.
  • Towards Universal Quantum Computing: In the ultimate outlook, photonic CVQC aims to realize a universal, large-scale quantum computer. If the technical challenges are met, this would mean a machine with on the order of millions of physical modes, error-corrected to yield maybe thousands of logical qubits (or qumodes), performing algorithms like Shor’s, Grover’s, and quantum simulations of big molecules or materials that classical computers cannot handle. Such a machine could factor large numbers, search unstructured databases, simulate high-temperature superconductors, etc. Achieving this is probably more than a decade away, but the building blocks are being assembled now. Many experts believe that to reach this endgame, fault tolerance is essential – which implies a huge overhead and engineering effort. Photonic CV systems will need an architecture for error correction that is competitive with things like the surface code on superconducting qubits. If GKP codes and CV cluster states live up to their promise, photonic systems could get there. There’s also research in topological photonic states (like using the frequency comb structure to create a topologically protected CV encoding) which might reduce overhead by having built-in protection.
  • Timelines and Milestones: It is always hard to forecast timelines in quantum computing, but a rough speculation: by 2030, we might have photonic CV devices with ~50-100 usable modes deployed for specific tasks (such as a modest speedup in a machine-learning pipeline or a specialized scientific simulation) – essentially early quantum accelerators. By 2035, perhaps a demonstration of a photonic CV quantum repeater or a network linking devices over long distances, enabling distributed computing or long-distance QKD with error correction. And beyond 2035, potentially a full-stack fault-tolerant photonic quantum computer at a scale that challenges classical supercomputers. Companies like Xanadu have aggressive roadmaps (e.g., aiming for a million qubits – in the photonic context, modes – by the late 2020s or 2030s), though such goals often shift over time.
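
Since GKP states anchor several of the milestones above, it helps to state what they are. The ideal square-lattice GKP code words are uniform combs of position eigenstates (any physical realization approximates the combs with finitely squeezed peaks under a Gaussian envelope):

```latex
|0_{\mathrm{GKP}}\rangle \propto \sum_{n \in \mathbb{Z}} \big|\, q = 2n\sqrt{\pi} \,\big\rangle ,
\qquad
|1_{\mathrm{GKP}}\rangle \propto \sum_{n \in \mathbb{Z}} \big|\, q = (2n+1)\sqrt{\pi} \,\big\rangle
```

Small displacement errors in q or p shift the combs by less than sqrt(pi)/2 and can be detected and corrected with Gaussian (homodyne-based) measurements, which is what makes GKP qubits so attractive for CV error correction.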
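
To quantify the squeezing-and-loss bullet: loss mixes a squeezed quadrature with vacuum, so its variance degrades as V_out = eta * V_in + (1 - eta) in shot-noise units, with eta the transmission. A few lines of Python with illustrative numbers show why percent-level loss budgets matter at 15 dB:

```python
import numpy as np

def squeezing_after_loss(s_in_dB: float, eta: float) -> float:
    """Effective squeezing (in dB) after a channel with transmission eta.

    The squeezed quadrature variance V (shot-noise units) mixes with
    vacuum under loss: V_out = eta * V_in + (1 - eta).
    """
    V_in = 10 ** (-s_in_dB / 10)
    V_out = eta * V_in + (1 - eta)
    return -10 * np.log10(V_out)

# How much of a 15 dB squeezed state survives various total losses?
for loss_pct in [0.5, 1, 5, 10]:
    s_out = squeezing_after_loss(15.0, 1 - loss_pct / 100)
    print(f"{loss_pct:4.1f}% loss: 15 dB -> {s_out:4.1f} dB")
```

Even 5% total loss knocks a 15 dB source down to about 11 dB – a reminder of how tight optical loss budgets are relative to proposed fault-tolerance thresholds.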

In conclusion, the future of photonic CV quantum computing will likely involve solving the current technical puzzles around non-Gaussian operations and error correction, gradually building up to more complex and reliable systems. We will see greater integration: of components (on chips), with classical computing, and into networks. Each incremental achievement – a better source, a larger entangled state, a corrected logical qubit, a useful quantum advantage – will pave the way for the next. Given the progress so far, photonic CVQC stands as one of the most exciting and viable paths toward large-scale quantum computation, promising a fusion of quantum computing with the communication and processing infrastructure of the future. If successful, it could usher in a new era of computational capability across science and industry, while working hand in hand with technologies that secure and transmit information in fundamentally new ways. The coming decade will be crucial in determining how this vision materializes, and the pieces are rapidly falling into place.

Marin Ivezic

I am the Founder of Applied Quantum (AppliedQuantum.com), a research-driven professional services firm dedicated to helping organizations unlock the transformative power of quantum technologies. Alongside leading its specialized service, Secure Quantum (SecureQuantum.com)—focused on quantum resilience and post-quantum cryptography—I also invest in cutting-edge quantum ventures through Quantum.Partners. Currently, I’m completing a PhD in Quantum Computing and authoring an upcoming book “Practical Quantum Resistance” (QuantumResistance.com) while regularly sharing news and insights on quantum computing and quantum security at PostQuantum.com. I’m primarily a cybersecurity and tech risk expert with more than three decades of experience, particularly in critical infrastructure cyber protection. That focus drew me into quantum computing in the early 2000s, and I’ve been captivated by its opportunities and risks ever since. So my experience in quantum tech stretches back decades, having previously founded Boston Photonics and PQ Defense where I engaged in quantum-related R&D well before the field’s mainstream emergence. Today, with quantum computing finally on the horizon, I’ve returned to a 100% focus on quantum technology and its associated risks—drawing on my quantum and AI background, decades of cybersecurity expertise, and experience overseeing major technology transformations—all to help organizations and nations safeguard themselves against quantum threats and capitalize on quantum-driven opportunities.