Quandela

(This profile is one entry in my 2025 series on quantum hardware roadmaps and CRQC risk. For the cross‑vendor overview, filters, and links to all companies, see Quantum Hardware Companies and Roadmaps Comparison 2025.)
Introduction
Quandela is a French quantum computing company founded in 2017 as a spin-off from the Centre for Nanoscience and Nanotechnology (C2N) in Paris. It has established itself as a pioneer in photonic quantum computing, focusing on single-photon-based qubits and integrated photonic circuits. Quandela’s core modality is optical (photonic) quantum computing, leveraging single photons as qubits. This approach differentiates it from superconducting or trapped-ion platforms, enabling operation largely at room temperature with far simpler cooling requirements.
In the global quantum ecosystem, Quandela is one of the leading proponents of photonic quantum hardware, alongside international peers like PsiQuantum and Xanadu, and is Europe’s foremost photonic quantum computing vendor. By combining expertise in semiconductor quantum light sources and optical circuits, Quandela is building universal gate-based quantum processors based on light, with the aim of achieving scalable and fault-tolerant quantum computers. The company is vertically integrated (full-stack), providing not only hardware but also a software stack (e.g. the Perceval quantum programming framework) and cloud access to its devices, positioning it as a comprehensive player in the quantum ecosystem.
Milestones & Roadmap
Quandela’s initial focus was on photonic components – notably creating high-quality single-photon sources from semiconductor quantum dots. By 2017-2019, its quantum dot light sources (branded Prometheus) were being adopted by research labs worldwide. Building on this, in 2020 the founders decided to expand into constructing complete quantum computers, hiring experts in quantum algorithms and software. This led to the development of Quandela’s first prototype quantum processing unit (QPU), codenamed Arcturus, which in late 2022 became the first functional photonic quantum computer delivered by Quandela. This early 6-qubit machine, deployed by Q4 2022, served as a proof-of-concept for the architecture. According to the company, it featured 6 fully entangled photonic qubits, single- and two-qubit gate fidelities around 99%, and effectively infinite qubit coherence time, since photons do not decohere. The Arcturus/MosaiQ-6 system was made accessible via the cloud in 2023, marking Europe’s first cloud-based quantum computer (6 qubits) available to external users. This milestone allowed manufacturers and researchers to run quantum algorithms remotely on a photonic processor, establishing Quandela’s cloud platform as a gateway to its hardware.
MosaiQ Series and Deployments: Building on Arcturus, Quandela launched MosaiQ as its flagship photonic quantum computing platform. The MosaiQ line is modular and upgradable, offering configurations from 6 up to 24 qubits. In March 2023, Quandela achieved a significant commercial milestone by delivering a MosaiQ system to OVHcloud, a major European cloud provider. This was the first sale and on-premises deployment of a Quandela quantum computer – installed in OVH’s datacenter in Croix, France. The machine (a 6-qubit photonic QPU in a rack) became part of OVHcloud’s offering for HPC-cloud integration, underscoring the “datacenter-ready” design of Quandela’s hardware. By late 2023, Quandela also opened a dedicated assembly and service facility for quantum computers in Massy, France, to streamline production and deployment. Internationally, the company didn’t stop at Europe: in 2024 it expanded to North America by launching a Canadian subsidiary and partnering with Exaion (EDF) to install a Quandela QPU at a data center in Quebec. In September 2024, the first European quantum computer in North America – a photonic QPU – went online at Exaion’s facility, connected to the PINQ² quantum innovation platform in Quebec. Around the same time, Quandela established a presence in Munich, with plans for hubs in South Korea and further international “quantum hubs” to drive adoption. By mid-2025, the company even set up a quantum computer in Sherbrooke, Canada (within a quantum innovation zone) to support R&D collaborations – demonstrating a growing global footprint for its machines.
Scaling Up – Belenos and Canopus: In May 2025, Quandela unveiled Belenos, a 12-qubit photonic quantum computer representing the next generation after MosaiQ-6. Belenos delivers roughly 4,000× more computing power than the earlier 6-qubit system, according to the company. This exponential gain is attributed to both the increased qubit count and architectural improvements. Belenos was made immediately accessible via Quandela’s cloud to over 1,200 users across 30 countries, and a fully integrated version is scheduled for delivery to the French national supercomputing center (CEA-TGCC) by end of 2025 as part of a EuroHPC initiative. The roadmap accelerates beyond this: Canopus, expected in 2026, is planned to double the qubit count again (to ~24 qubits), which Quandela projects as another 16-million-fold increase in computational power over the first generation. Within three years (by ~2028), the startup plans to surpass 40 qubits in a single machine. Notably, surpassing ~50 qubits is seen as a critical threshold where classical simulation becomes infeasible, so a >40-qubit photonic processor would exceed the simulation capabilities of any conventional supercomputer. This suggests Quandela is targeting a quantum advantage regime in the latter part of the decade.
Technical Progress and Fidelity: Alongside qubit count, Quandela tracks performance in terms of Quantum Operations Per Second (QOPS) and gate fidelities. The MosaiQ architecture (6-24 qubits) operates its single-photon sources at an 80 MHz clock rate, achieving on the order of 5×10² QOPS at 12 qubits and ~2×10³ QOPS at 24 qubits. High gate fidelities are crucial for scaling: reported single-qubit gate fidelity is ~99.6% and two-qubit (entangling) gate fidelity ~99% on current hardware. These figures – achieved through calibration and error mitigation (“Corrected Fidelity”) – are on par with many superconducting qubit systems, and provide a baseline for implementing small quantum circuits reliably. The company has also demonstrated multi-qubit operations (e.g. a three-qubit Toffoli gate with ~90% fidelity) on its photonic platform.
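To see what those per-gate fidelities imply for whole circuits, a first-order estimate simply multiplies the fidelities of every gate in the circuit, assuming independent, uncorrelated errors. The circuit size below is an arbitrary illustration, not a Quandela benchmark; only the ~99.6% and ~99% figures come from the text.

```python
# Rough circuit-fidelity estimate from per-gate fidelities.
# Assumes independent gate errors (a common first-order model).

def circuit_fidelity(f1: float, f2: float, n1: int, n2: int) -> float:
    """Estimated fidelity of a circuit with n1 single-qubit gates
    (fidelity f1 each) and n2 two-qubit gates (fidelity f2 each)."""
    return (f1 ** n1) * (f2 ** n2)

# Hypothetical small circuit: 24 single-qubit and 11 two-qubit gates,
# using the ~99.6% / ~99% fidelities reported in the text.
f = circuit_fidelity(0.996, 0.99, 24, 11)
print(f"estimated circuit fidelity: {f:.1%}")
```

Even at these fidelities, a few dozen gates already erode overall circuit fidelity to roughly 80%, which is why the roadmap pivots from mitigation to full error correction.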
Looking forward, Quandela’s published Technology Roadmap (2024-2030) is explicitly focused on fault-tolerant scaling. Key milestones include achieving the first logical qubit by 2025, reaching ~50 logical qubits by 2028, and operating “hundreds of logical qubits” by 2030 in a fault-tolerant universal quantum computer. To support this growth, Quandela plans to industrialize its manufacturing: it aims to assemble ~4 quantum computers per year from 2025 onward, and open a second quantum computer factory by 2027 to enable large-scale assembly of error-corrected machines by 2028. This aggressive roadmap is backed by the company’s track record of hitting its past targets – for instance, being first in Europe to offer a cloud QPU and to sell a quantum computer to a private client. It is also supported by public R&D programs: Quandela was selected in 2024 as one of five companies in the French DGA’s PROQCIMA program (a defense-funded initiative to achieve quantum advantage and eventually two made-in-France universal quantum prototypes by 2032).
In summary, Quandela’s roadmap lays out a clear evolution: from a few high-quality photonic qubits in the early 2020s, doubling qubit counts annually through the MosaiQ, Belenos, and Canopus generations, then transitioning to error-corrected logical qubits by the mid/late-2020s. By leveraging a modular photonic architecture and international partnerships (OVHcloud, Exaion/EDF, EuroHPC, etc.), Quandela is steadily scaling up hardware while delivering value at each step – e.g. making systems available via cloud and on-premises for real-world use cases even before full fault tolerance is reached.
Focus on Fault Tolerance
Achieving fault-tolerant quantum computing – i.e. quantum computers that can correct their own errors and reliably run deep algorithms – is a central goal in Quandela’s strategy. The company’s approach to fault tolerance is built on a hybrid photonic architecture called SPOQC (Spin-Optical Quantum Computing). In simple terms, SPOQC marries matter qubits (spins in quantum dot emitters) with photonic interconnects to get the best of both worlds. The basic idea is to use semiconductor quantum emitters that have an internal spin state (e.g. an electron or exciton spin) as stationary qubits, and use the photons they emit as “flying” qubits to entangle those spins across the processor. This enables a form of deterministic entanglement creation: when a quantum dot’s spin is manipulated and made to emit photons, those photons can be used to entangle the spin with other remote spins via optical interference. Compared to all-photonic schemes where two-photon gates are inherently probabilistic, the presence of a spin memory means an entangling operation can be repeated-until-success without losing qubit state.
SPOQC Architecture: In a 2024 scientific paper, Quandela’s researchers detailed this architecture. The system is envisioned as a network of quantum dot spin qubits (nodes) that are connected by photonic links (optical modes acting as edges). Entangling gates between any two spins are performed by interfering photons from those emitters using linear optics and single-photon detectors – a heralded entanglement scheme. If an attempt fails (no entanglement due to a probabilistic photon outcome), the system can try again by generating another photon, since the spins remain available; once the gate succeeds, the protocol moves on. This hybrid approach aims to realize the connectivity and fast gates of photonics while retaining the error correction benefits of matter qubits (which can serve as the logical qubit memory). Notably, SPOQC is designed to support standard quantum error-correcting codes (like surface codes or LDPC codes) with high connectivity. Because photons can link distant qubits easily, the architecture naturally supports non-local connections required by certain efficient error-correcting codes (e.g. allowing low-degree graphs or LDPC codes without 2D grid constraints).
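The repeat-until-success pattern described above has a simple statistical signature: if each heralded attempt succeeds with probability p, the expected number of attempts is 1/p, and failures cost only time because the spins retain their state. The heralding probability below is purely illustrative; real values depend on photon loss, interference visibility, and detector efficiency.

```python
import random

# Monte Carlo sketch of repeat-until-success heralded entanglement
# between two spin qubits, as in the SPOQC scheme described above.
# p_success = 0.25 is a hypothetical heralding probability.

def attempts_until_success(p_success: float, rng: random.Random) -> int:
    """Heralded attempts needed before a spin-spin entangling link
    succeeds; the spins keep their state between attempts."""
    attempts = 1
    while rng.random() >= p_success:
        attempts += 1
    return attempts

rng = random.Random(42)
p = 0.25
trials = [attempts_until_success(p, rng) for _ in range(100_000)]
mean = sum(trials) / len(trials)
print(f"mean attempts ~= {mean:.2f} (geometric expectation 1/p = {1/p:.0f})")
```

This is the key contrast with all-photonic gates: there, a failed probabilistic gate destroys the photonic qubits involved, whereas here a failure merely adds another round-trip of latency.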
A major claimed advantage of Quandela’s fault-tolerance scheme is resource efficiency. Their hybrid spin-photon design achieves loss tolerance comparable to all-photonic cluster-state architectures, but with far fewer physical components. Each photon in the SPOQC scheme needs to pass through only a single optical switch and a small interferometer before detection (as opposed to traversing many multiplexers and beam splitters in an all-photonic fusion network). This relative simplicity relaxes hardware requirements and reduces optical losses. In fact, Quandela announced that this approach can reduce the number of components per logical qubit by 100,000× compared to pure photonic schemes. (All-photonic error-correction methods, like those pursued by some competitors in the US/Canada, might require on the order of a million components to encode one logical qubit, whereas utilizing spin emitters as qubits could cut that to ~12 components, by one estimate.) This dramatic reduction addresses a critical challenge for photonic fault tolerance: minimizing photon loss and error rates by shortening the optical paths and complexity that photons experience. Fewer components mean higher overall transmission for the photons, which is essential for successfully manipulating and stabilizing large entangled states.
On the error-correction protocols side, SPOQC allows any standard code to be implemented, but it particularly excels with codes that benefit from long-range links (e.g. it could execute a 3D cluster-state code or a LDPC code with far fewer timesteps). Quandela has indicated that its design will target tens of physical qubits per logical qubit, a significantly lower overhead than some other platforms. Their roadmap explicitly calls for demonstrating the first logical qubit in 2025 – likely by encoding a logical qubit in a small photonic error-correcting code and showing improved error rates. By 2028, they aim to network multiple photonic processors to scale to ~50 logical qubits and beyond. Achieving these milestones will require implementing real-time feedforward operations in hardware, since error correction and adaptive measurements go hand-in-hand. (Feedforward means using measurement outcomes to condition subsequent operations, something naturally supported in spin-photon gates by classical logic.) Indeed, Quandela has highlighted that adaptive, measurement-based quantum computing is “the native route to quantum universality for photonic systems” and that feedforward control is crucial for their error correction schemes. They have already introduced feedforward capabilities in simulation (via Perceval software) and are working to integrate the required fast optical switches, low-latency electronics, and optical delay lines into their hardware in the coming years.
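To put the "tens of physical qubits per logical qubit" target in context, a standard yardstick is the rotated surface code: at distance d it uses d² data qubits plus d²−1 measurement ancillas, and its logical error rate falls roughly as (p/p_th)^((d+1)/2). This is a generic textbook model, not Quandela's actual code choice; the prefactor and threshold below are illustrative.

```python
# Generic surface-code overhead yardstick (illustrative, not
# Quandela's specific code or parameters).

def physical_per_logical(d: int) -> int:
    """Rotated surface code, distance d: d*d data + d*d - 1 ancillas."""
    return 2 * d * d - 1

def logical_error_rate(p: float, p_th: float, d: int, a: float = 0.1) -> float:
    """Common scaling heuristic: p_L ~ a * (p / p_th) ** ((d + 1) / 2).
    p = physical error rate, p_th = code threshold (assumed values)."""
    return a * (p / p_th) ** ((d + 1) // 2)

for d in (3, 5, 7):
    n = physical_per_logical(d)
    pl = logical_error_rate(1e-3, 1e-2, d)
    print(f"d={d}: {n} physical qubits/logical, p_L ~ {pl:.0e}")
```

At distance 5 this gives 49 physical qubits per logical qubit, consistent with an overhead in the "tens"; codes with long-range connectivity (LDPC-style), which SPOQC's photonic links enable, aim to push that overhead lower still.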
In summary, Quandela’s plan for fault tolerance centers on using photonic qubits in combination with matter qubits to enable error correction with minimal overhead. By leveraging deterministic photon generation from quantum dot spins (sometimes termed “artificial atoms”), they intend to create large entangled cluster states or error-correcting code states with far fewer resources than traditional photonic approaches. This SPOQC architecture is at the heart of their 2024-2030 roadmap, driving the progression from today’s small prototypes to, as CEO Niccolo Somaschi puts it, “large-scale fault-tolerant quantum computers by 2028 and beyond”. If successful, it would be a significant step toward practical photonic quantum computers capable of running long algorithms reliably.
CRQC Implications
Cryptographically Relevant Quantum Computing (CRQC) refers to quantum machines powerful enough to break modern cryptography (e.g. factoring large RSA keys or cracking elliptic-curve cryptography) – generally estimated to require thousands of logical qubits and sufficiently low error rates. Quandela’s roadmap is aggressive, but within the 5-10 year horizon it is targeting on the order of tens to a few hundred logical qubits. Specifically, by 2028 they aim for ~50 logical qubits, and by 2030, “hundreds of logical qubits” in operation. This would be a remarkable achievement, yet still likely below the threshold for running Shor’s algorithm on cryptographically relevant key sizes (which would require thousands of logical qubits and a large number of error-corrected operations). In other words, if Quandela meets its 2030 goals, it will be a major player in quantum computing but probably not an immediate cryptographic threat.
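The "thousands of logical qubits" bar can be made concrete with a standard reference point from the literature: Beauregard's circuit for Shor's algorithm factors an n-bit modulus using 2n+3 logical qubits, before the substantial additional overheads of routing and magic-state distillation. The arithmetic below is just that published lower bound, not a Quandela estimate.

```python
# Lower-bound logical-qubit count for Shor's algorithm on an n-bit
# RSA modulus, per Beauregard's 2n+3-qubit circuit. Real machines
# need considerably more once distillation and routing are included.

def shor_logical_qubits(n_bits: int) -> int:
    return 2 * n_bits + 3

for n in (1024, 2048):
    print(f"RSA-{n}: >= {shor_logical_qubits(n)} logical qubits")
```

Against that ~4,100-logical-qubit floor for RSA-2048, Quandela's 2030 target of "hundreds of logical qubits" is an order of magnitude short, which is the basis for the assessment in this section.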
However, the trajectory of their technology is what matters for CRQC. Photonic quantum computing is considered highly scalable in principle – especially because of the networking modularity and the ability to leverage photonic integration. If by ~2030 Quandela has a fault-tolerant architecture with hundreds of logical qubits, scaling that further to thousands might be more an engineering/industrial effort than a fundamental science hurdle. The company’s focus on industrialization (multiple factories, assembly lines, etc.) and reducing component counts per logical qubit is directly aimed at making large-scale machines feasible. Additionally, their emphasis on connecting multiple processors via optical fiber by 2028 hints that they could scale through a distributed approach – linking smaller quantum modules into a larger quantum cluster, analogous to how classical supercomputers network many nodes. Such distributed quantum computing could, in theory, be used to factor large numbers if enough modules are linked.
In the next 5 years (2025-2030), Quandela’s tech is unlikely to reach the point of breaking RSA-2048, as that would require a leap to perhaps thousands of logical qubits. Their own milestones (50 logical qubits by 2028) indicate a more modest but still significant scale. This aligns with public initiatives like the French PROQCIMA program and EuroHPC projects that aim for intermediate prototypes in this timeframe. Governments and industry are watching photonic approaches closely because of their long-term potential for CRQC – indeed, fault-tolerant photonic machines are considered promising for factoring due to potentially lower error rates and easier scaling of qubit numbers. Quandela’s recent breakthrough claims (the 100,000× resource reduction) specifically mention prime number factorization as one of the impactful algorithms enabled by fault-tolerant quantum computing. It’s clear that breaking cryptography is on the radar once error-corrected machines are viable: the press release notes that fault-tolerant QC will unlock algorithms like factorization and that those are exactly the high-impact use cases classical computers can’t handle.
Bottom line: Quandela’s current roadmap, if executed on schedule, suggests that by the early 2030s they could have the building blocks of a CRQC-capable system, but not necessarily the full cryptanalysis machine yet. In a 5-year view, their photonic technology is unlikely to pose an immediate risk to cryptographic security, especially compared to some larger-scale efforts worldwide. Nevertheless, the progress towards dozens or hundreds of logical qubits is non-trivial – it represents a path toward eventually reaching CRQC. Quandela’s emphasis on error correction and modular scaling is aligned with what’s needed for CRQC, and the company is explicitly working on those necessary ingredients (high efficiency qubits, networking, error decoders by 2027, etc.). As such, their work contributes to the advancing timeline for when CRQC might become reality.
Modality & Strengths/Trade-offs
Quandela’s quantum computers use the photonic modality, meaning qubits are encoded in particles of light (such as single photons in specific polarization or spatial modes). This approach carries distinct advantages and trade-offs compared to matter-based qubits:
No Decoherence (Long Lifetimes): Photonic qubits do not suffer from environmental decoherence in the same way matter qubits do. Once created, a photon in free space will maintain its quantum state virtually indefinitely until it interacts or is absorbed. Quandela highlights that their photonic qubits have effectively infinite coherence time – there is no phase decay while a photon propagates. This is a stark contrast to superconducting or trapped-ion qubits, which have coherence times on the order of microseconds to seconds. The lack of decoherence means fewer errors accumulate simply due to the passage of time, easing the burden on error correction (aside from loss errors).
Room-Temperature Operation: Photonic quantum processors can operate at or near room temperature (with some components like detectors or sources requiring cooling). Quandela’s systems use semiconductor photon sources and superconducting nanowire detectors, which are housed in compact cryostats, but the core photonic logic runs at ambient temperature in integrated optical circuits. There is no need for dilution refrigerators or for the complex vacuum chambers and magnetic shielding of other modalities. This significantly reduces the overhead and cost of operation, and as demonstrated by MosaiQ, allows the entire quantum computer to be integrated into a standard datacenter rack with an air-cooled cryocooler. The simpler cooling needs are a practical strength, potentially improving uptime and maintainability.
High Connectivity and Modularity: Photons can easily travel between distant locations via fiber or free-space, enabling natural communication between qubits over long distances. This means photonic qubits can have an all-to-all connectivity in principle – any two qubits can be entangled by interfering their photons, without the limitations of geometry that plague, say, a 2D superconducting chip. Quandela’s photonic chips are described as having all-to-all topology within the processor, and moreover, multiple such processors can be linked by optical fiber to create a larger entangled system. This modularity is a huge advantage for scaling: additional qubit modules can be added and entangled to grow the quantum computer, analogous to adding nodes to a network. The photonic approach is inherently networkable, which is key for building distributed quantum computing clusters or quantum internet applications.
Speed (High Bandwidth Operations): Photonic gates can be extremely fast, limited ultimately by how quickly optical switches and detectors operate. Quandela’s single-photon sources operate at 80 MHz (emitting 80 million photons per second), and their optical circuit gates and detectors can respond on nanosecond timescales. This allows a high rate of quantum gate operations per second (QOPS), measured in the hundreds to thousands of operations/sec even for a small number of qubits. In principle, the speed of photons could enable high-clock-rate quantum processors that complete algorithms faster than slower gating technologies (though classical control electronics and signal latency impose limits).
No Crosstalk / Isolation: In photonic circuits, qubits are typically encoded in separate optical modes that do not interact unless specifically made to interfere at a beam splitter. This means photonic qubits exhibit negligible crosstalk; one qubit’s operation won’t inadvertently affect another if they are on different wavelengths or spatial paths. This isolation simplifies control – gating one photon usually won’t disturb others, unlike in, say, ion traps where Coulomb interaction can cause spectator errors.
Despite these strengths, photonic quantum computing faces critical challenges and trade-offs:
Loss and Detection Efficiency: The predominant source of error in photonics is photon loss. Photons can be absorbed or scattered in optical components (fiber, waveguides, beam splitters, etc.), or simply not detected due to detector inefficiency. Every additional component a photon passes through is an opportunity for loss. Quandela’s approach recognizes this: high overall optical transmission is needed to scale to many qubits. They mitigate loss by reducing components (using integrated photonic chips and minimizing stages) and employing high-performance detectors. Quandela uses superconducting nanowire single-photon detectors (SNSPDs) which can have >90% detection efficiency, but these require cryogenic cooling. Even with top-tier components, scaling to large photonic circuits with low loss remains challenging – ensuring that, say, 99% of photons survive through dozens of gates is an ongoing engineering battle. Efficient single-photon sources are also crucial; if the source occasionally fails to emit a photon, that is effectively a loss error. Quandela’s quantum dot sources (e.g. eDelight) are designed for on-demand emission of pure single photons with high brightness. Still, no source is perfect, and any missed photon emission could introduce errors in computations.
Probabilistic Gates and Need for Feedforward: In linear optical quantum computing, two-qubit gates (like CNOTs) typically rely on interference and measurement – e.g. the Knill-Laflamme-Milburn scheme – which only succeed with a certain probability. Without auxiliary resources, non-deterministic gates mean you must either try many times or use more complex entangled resource states. Feedforward logic is required to make probabilistic operations viable: the system must detect whether a gate succeeded (via measurement) and, if not, either repeat the gate or adjust the circuit on the fly. This need for real-time decision and fast optical switching is a major technical challenge unique to photonics. Quandela addresses it by developing fast optical modulators and electronic controllers to reroute photons or adapt measurements within nanoseconds. For example, to implement an entangled cluster state, some photons might be measured and based on that result, a switch will direct the next photon appropriately. The latency from detection to action must be shorter than the photonic qubit’s transit time through a delay line. Achieving, say, sub-nanosecond feedforward is at the cutting edge of today’s technology and requires ultra-low-latency electronics, low-loss optical delay lines (fiber loops or integrated spirals), and high-speed Pockels cells or switches. Until these are fully realized in hardware, photonic systems may not harness their full theoretical power. (Quandela has so far enabled feedforward in simulation and is incrementally adding it to hardware as components mature.)
Scaling and Manufacturing Complexity: While photonic chips can be made with semiconductor fabrication techniques, a large-scale photonic quantum computer still involves complex system integration: combining lasers, nonlinear sources (quantum dot in cryostat), fiber coupling, integrated interferometer chips, single-photon detectors, and electronics. Packing all this with stability is non-trivial. Thermal fluctuations can cause phase drift in interferometers, requiring active stabilization. Quandela’s recent research includes work on crosstalk mitigation in photonic integrated circuits to address phase-shifter interference and calibration issues on a 12-mode chip. This shows that controlling many optical elements with precision is challenging as circuits grow. Additionally, unlike solid-state qubits that stay on a chip, photons travel – meaning mirrors, fiber connectors, and alignment matter. There’s an optical alignment and packaging hurdle: maintaining alignment of dozens of fiber inputs/outputs to chips and detectors over time without loss. Quandela’s strategy of using integrated photonics helps (since many interferometers can be on one chip), but connecting multiple chips or modules is still a mechanical and optical feat. Manufacturing yield is also a concern: fabricating identical low-loss waveguides or quantum dot sources with consistent performance is hard. Quandela has invested in a new cleanroom (opened 2024) to produce single-photon sources in-house, indicating the importance of controlling the fabrication process. The trade-off here is that photonic hardware may leverage semiconductor manufacturing, but it demands near flawlessness in optical quality – something that requires cutting-edge processes and potentially significant investment.
No Quantum Memory (without matter qubits): Pure photonic systems lack a native way to store a qubit – photons either propagate or are measured; they can’t be paused easily. (There are experimental optical memories, but they are typically inefficient or short-lived.) Quandela’s use of spin qubits in SPOQC is precisely to introduce a memory element. But currently, its smaller-scale machines (MosaiQ-6, Belenos-12) are largely photonics-only, meaning they rely on delaying photons in fiber loops to synchronize operations. This is a limitation: long delay lines introduce loss and only provide millisecond-scale storage. Without a true quantum memory, synchronization of many qubits becomes very complex. This is why the integration of semiconductor spins (as a form of quantum memory) in their architecture is crucial for the long-term.
Detector and Source Constraints: Photonic qubits require specialized hardware at both ends: single-photon sources and single-photon detectors. Both typically operate at cryogenic temperatures (SNSPDs at ~2-4 K, quantum dot sources often at 4-10 K). While these are much simpler cryogenics than a full qubit fridge, they still are a bottleneck for scalability – each added detector or source might need another channel in a cryostat, more wiring, etc. Quandela’s design uses a time-multiplexed source (a single quantum dot can supply many time-bin qubits), which helps reduce the number of physical sources needed. They also mention “pseudo-PNR detectors” for photon number resolution, implying using multiple SNSPDs or a small array to detect multiple photons – again adding complexity. Ensuring detector efficiency and low dark counts across potentially hundreds of detectors in a large system is challenging. Each detector also needs high-speed readout electronics.
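Two of the constraints above reduce to simple arithmetic: end-to-end photon survival is the product of every component's transmission times the detector efficiency, and the feedforward requirement fixes a minimum length of optical delay line. The per-component transmission and control latency below are assumed round numbers for illustration, not Quandela specifications; the ~90% SNSPD efficiency matches the figure in the text.

```python
# Illustrative photonic loss and feedforward budgets (assumed numbers).

def end_to_end_transmission(t_component: float, n_components: int,
                            detector_eff: float) -> float:
    """Probability that a photon survives n optical components and is
    then registered by the detector."""
    return (t_component ** n_components) * detector_eff

# 99% transmission per component, 90%-efficient SNSPD (assumptions):
print(f"10 components:  {end_to_end_transmission(0.99, 10, 0.90):.1%}")
print(f"100 components: {end_to_end_transmission(0.99, 100, 0.90):.1%}")

# Feedforward: the photon must still be in flight while the classical
# electronics decide. Fiber length needed to buffer a given latency:
C = 299_792_458       # speed of light in vacuum, m/s
N_FIBER = 1.468       # typical refractive index of silica fiber

def fiber_delay_length(latency_s: float) -> float:
    return C / N_FIBER * latency_s

print(f"10 ns control latency -> {fiber_delay_length(10e-9):.2f} m of fiber")
```

The first calculation shows why the component-count reductions claimed for SPOQC matter so much: going from 10 to 100 components cuts survival from roughly 81% to roughly 33% even at 99% per-component transmission. The second shows that every 10 ns of control latency costs about 2 m of lossy fiber delay, which is why sub-nanosecond feedforward electronics are on the critical path.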
In summary, photonic quantum computing offers the promise of inherently stable, fast and modular quantum processors that can operate in standard environments, which Quandela is harnessing through its single-photon and integrated optics expertise. The trade-offs involve overcoming loss and probabilistic operation through clever design (like hybrid architectures and feedforward) and solving significant engineering challenges in optical hardware. Quandela’s approach specifically tries to maximize the strengths – by using ultra-efficient sources, minimizing component count, and exploiting networking – while addressing the weaknesses via technological innovation (e.g., integrating spins to sidestep the probabilistic gate issue, developing better photonic circuits to reduce loss). It’s a modality that is high-risk, high-reward: difficult to scale, but if scaled, potentially very powerful and flexible.
Track Record
Quandela has built a solid track record in a short time, transitioning from a research spin-off to delivering functional quantum computers and advancing photonic quantum science. Key aspects of their execution history and ecosystem validation include:
Research Foundations and Publications: Quandela’s team boasts a deep research pedigree in quantum optics and semiconductor nanotechnology. Co-founder Pascale Senellart-Mardon, for example, is a CNRS researcher who pioneered solid-state single-photon sources. Early on, the company’s strategy was to capitalize on scientific breakthroughs (like high-purity single photons from quantum dots) and turn them into products. They maintain an active R&D program; according to their site, Quandela’s researchers have published on the order of ~70 papers, including notable results in photonic chip design, quantum algorithms, and the SPOQC architecture. In 2024, their scientists published the “A Spin-Optical Quantum Computing Architecture” paper in the journal Quantum, outlining the blueprint for their fault-tolerant design.
They also collaborate with academia: in late 2024, Quandela and C2N established a joint research lab named QDLight to continue advancing quantum dot and photonic technologies for the computers. This indicates a commitment to scientific innovation alongside product development.
The company’s contributions have been recognized in the community; for instance, in October 2024 Quandela received the “Breakthrough Deep Tech Innovation” award from La French Tech in Germany, signaling external validation of its technology trajectory.
Product Delivery and Clients: On the commercial side, Quandela has hit milestones that few quantum start-ups have: actually delivering hardware to end users. The first delivery was the MosaiQ 6-qubit machine (Arcturus) in late 2022, used internally and for cloud access. Then in 2023 came the OVHcloud deployment, making Quandela “the first European player to sell and deploy a quantum computer to a private client”. OVHcloud’s endorsement (they are one of Europe’s largest cloud providers) is a strong testament; OVH is integrating the photonic QPU with classical infrastructure for HPC customers. By 2024, Quandela delivered a system to Exaion/EDF in Canada, meaning their machines are now in use on two continents.
Additionally, Quandela has provided access to over a thousand cloud users through its own platform – by mid-2025, 1,200+ researchers and partners had accounts to use the Belenos 12-qubit system online. This user base spans academia and industry (40% of users are in France, others across Europe, North America, Asia) and points to growing interest and engagement with their technology. The cloud access and on-premise sales demonstrate that Quandela’s hardware is not just a lab prototype but a deployable product. In fact, their MosaiQ machines are offered on a build-to-order basis – they even advertise the ability to deliver a custom photonic quantum computer in 8-10 months to organizations ready to host one. This is quite unique in the quantum industry and speaks to their confidence in replicating and shipping their systems.
Performance and Benchmarks: While photonic quantum computers are still relatively small in qubit count, Quandela has sought to validate their performance on meaningful tasks. A highlight was winning the BMW-Airbus Quantum Challenge in December 2024 in the category of quantum machine learning. Quandela’s team demonstrated an image classification and generation algorithm running on their hardware (likely a hybrid quantum-classical approach) that outperformed others, showcasing the speed of calculation and number of operations per data point their photonic processor could handle. The company stated that Belenos enables use cases in machine learning that competitors cannot yet tackle due to its computational throughput.
Moreover, EuroHPC’s selection of Quandela in 2023 to deliver a photonic quantum accelerator for a French supercomputing center is another third-party validation. EuroHPC chose a consortium led by Quandela (with partners like CEA and Atos) to integrate a photonic QPU with HPC – meaning experts assessed Quandela’s tech as the most promising for a European quantum boost.
The French Defense (DGA) PROQCIMA program selection in 2024 also underscores credibility; Quandela is among a few companies entrusted with government support to push towards quantum advantage prototypes. These selections often involve rigorous evaluation of technical maturity and potential, suggesting Quandela has passed multiple vetting processes.
Partnerships and Collaborations: Quandela has been proactive in building partnerships across the value chain. On the hardware side, they collaborate with photonics manufacturers and labs – for instance, they have announced a partnership with DTU in Denmark for photonic device fabrication. They also teamed up with Quantinuum (another quantum leader) on a “photon recycling” method for improved photonic information processing, indicating a willingness to co-develop techniques.
On the industry application side, Quandela is working with large enterprises like EDF (Electricité de France) on quantum algorithms for real-world problems, such as detecting cracks in dam infrastructure. This is an example of co-design: developing algorithms hand-in-hand with the hardware to achieve “quantum utility” in an industrial setting. Another partnership is with Mila (Quebec’s AI Institute) in 2025, focusing on hybrid quantum-classical machine learning algorithms. Mila’s involvement (founded by Yoshua Bengio) gives Quandela a strong foothold in the AI+Quantum intersection, and they plan joint benchmarking of QML models on Quandela’s hardware vs classical approaches. This not only helps improve their algorithms but also increases the visibility of photonic QC in the AI community. The partnership announcement explicitly notes that it follows “the recent installation of a quantum computer in Sherbrooke”, emphasizing how deploying a machine locally facilitated deeper collaboration in Canada.
Community and Software: As a full-stack player, Quandela contributes to the quantum software ecosystem as well. They maintain Perceval, an open-source photonic quantum computing library, which allows users to design and simulate photonic circuits and now supports feedforward operations. By integrating Perceval with mainstream frameworks (they note compatibility with Qiskit, myQLM, etc. for their hardware), Quandela makes it easier for developers to experiment on photonic QPUs. This outreach and tooling have likely helped them grow that cloud user base. Their Quandela Cloud platform not only offers access to real QPUs but also high-performance simulators and a library of pre-built algorithms, lowering the barrier for new users to try photonic quantum computing. By building a community of users and developers early, Quandela is cultivating an ecosystem that can drive demand for its hardware.
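To give a flavour of the kind of computation a photonic circuit simulator like Perceval performs, the following self-contained Python sketch reproduces Hong-Ou-Mandel interference of two photons on a 50:50 beam splitter using the standard matrix-permanent rule for linear optics. The helper functions and conventions here are our own illustration, not Perceval's actual API:

```python
import itertools
import math

def permanent(m):
    """Brute-force matrix permanent; adequate for the tiny matrices here."""
    n = len(m)
    return sum(
        math.prod(m[i][p[i]] for i in range(n))
        for p in itertools.permutations(range(n))
    )

def output_amplitude(U, inp, out):
    """Fock-state transition amplitude <out|U|inp> for a linear-optical circuit."""
    # Repeat column j of U inp[j] times and row i out[i] times,
    # then take the permanent (the standard linear-optics rule).
    cols = [j for j, s in enumerate(inp) for _ in range(s)]
    rows = [i for i, t in enumerate(out) for _ in range(t)]
    sub = [[U[i][j] for j in cols] for i in rows]
    norm = math.sqrt(math.prod(math.factorial(k) for k in inp + out))
    return permanent(sub) / norm

# 50:50 beam splitter in a real (Hadamard-like) convention
r = 1 / math.sqrt(2)
BS = [[r, r], [r, -r]]

# Hong-Ou-Mandel effect: one photon in each input mode
p_coincidence = abs(output_amplitude(BS, [1, 1], [1, 1])) ** 2
p_bunched = abs(output_amplitude(BS, [1, 1], [2, 0])) ** 2
print(round(p_coincidence, 12))  # 0.0: the photons always leave together
print(round(p_bunched, 12))      # 0.5
```

The coincidence probability vanishes because the two single-photon amplitudes interfere destructively, which is precisely why high photon indistinguishability matters so much for Quandela's quantum dot sources.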
In summary, Quandela’s execution history shows a company that has consistently turned R&D into deliverables: from selling quantum light source components used by many labs, to building one of the first cloud quantum computers in Europe, to being the first in Europe to sell a quantum computer to a business customer, and expanding internationally with deployed systems in France and Canada. Their ability to meet self-imposed goals (as noted in their 2024 roadmap, they have “always met or exceeded objectives in recent years”) builds confidence. They’ve also aligned with key institutional players (EuroHPC, DGA, national labs) and won industry challenges, which provides independent validation of their technology’s promise. All these factors contribute to Quandela’s reputation as a leading hardware developer in the quantum space, with a unique photonic approach that is now recognized alongside the more traditional superconducting and ion-trap efforts.
Challenges
While Quandela has made impressive strides, it faces several challenges – technical, manufacturing, and commercial – on the road to fully realizing its vision:
Technical Challenges in Hardware Scaling: Perhaps the most daunting issues are those intrinsic to photonic quantum computing hardware:
- Optical Loss and Component Quality: As mentioned, photon loss is the primary error source. To achieve fault-tolerant operations, Quandela must maintain extremely high optical throughput across increasingly complex optical circuits. This means every mirror, waveguide, beam splitter, and fiber connection must be optimized. Even with their hybrid SPOQC approach reducing component count, a logical qubit might still involve a dozen or more photonic elements. The challenge is that each element can introduce a little loss; ensuring that 99.9% of photons survive through a whole error-correcting circuit is non-trivial. Quandela’s breakthrough claim of 100,000× fewer components than all-photonic schemes sets a target, but delivering those few-but-perfect components in practice is hard. They will need exceptional fabrication precision for photonic chips (to minimize scattering loss and mode mismatch) and ultra-low-loss optical switches and delays. This is an active engineering battle: for instance, integrated optical switches (electro-optic or MEMS based) often have insertion losses of a few percent, which is too high at scale. Moreover, as systems grow, so do the insertion points for loss (fiber coupling, etc.). Quandela’s new dedicated cleanroom for source fabrication and its research into better interferometer calibration underscore that tackling loss and imperfections is an ongoing effort. Reaching the needed performance might require new innovations in optical design (e.g., photonic crystal waveguides for low loss, better anti-reflection coatings, etc.).
- Speed of Feedforward (Timing Constraints): Implementing real-time feedforward control in hardware is another significant challenge. Photons move at ~20 cm per nanosecond in fiber; even a few meters of delay line only buys tens of nanoseconds of decision time. The classical electronics and logic to process a detection signal and reconfigure an optical element must operate on this timescale. Achieving nanosecond-scale low-latency control is at the edge of current technology (FPGA gate delays, signal amplification, etc.). Synchronization is extremely demanding: multiple photons need to be kept in sync while others are measured and while a decision is made. As Quandela integrates feedforward, they might encounter challenges in RF engineering and signal distribution that are new to photonics (which historically often did post-processing feedforward rather than real-time). Additionally, scaling feedforward to many qubits means many parallel classical channels to manage. There’s a risk of a control bottleneck: the more qubits, the more detections per second (on Belenos, potentially billions of detection events per second across detectors). Processing that without introducing delay or errors in routing is a formidable task. This is why the hardware constraints listed – nanosecond optical switches, low-loss delay lines, minimal latency – are essentially a wishlist of tech that’s still being refined. Ensuring these exist at scale is a prerequisite for fault-tolerant photonics, and a challenge Quandela must overcome step by step.
- Integration of Spin Qubits: The SPOQC approach calls for embedding quantum dot spin qubits into the architecture. Today, Quandela’s systems use the quantum dot mainly as a photon gun; using the spin of that quantum dot as a qubit adds complexity. It means you need to coherently control the spin state (with microwave or optical pulses), and have it remain coherent while multiple photons are emitted and interference-based gates occur. Maintaining spin coherence in a solid-state environment (quantum dot in cryostat) can be challenging due to interactions with the nuclear spin environment of the host material. Techniques like dynamical decoupling or isotopically pure materials might be needed. Additionally, reading out and resetting spins quickly to use them in an error-correcting code loop adds another layer of engineering (some form of fast optical spin readout or cycling transitions). Essentially, Quandela will have to merge the tech of solid-state spin qubits (similar to NV centers or quantum dot qubits as pursued in academic labs) with their photonic platform. This hybridization is cutting-edge and potentially difficult to troubleshoot because issues could arise from either the photonic side or the spin side. The upshot is that while the theoretical architecture is sound, the practical integration of a large number of spin-photon units with uniform performance is unproven and a key challenge ahead.
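The compounding loss described in the first bullet above is simple to quantify. A minimal sketch in Python, using assumed per-component transmission figures rather than Quandela's actual specifications:

```python
# Illustrative loss budget; the transmission figures are assumptions,
# not Quandela-published specifications.
def survival(per_component_transmission: float, n_components: int) -> float:
    """Probability that a photon makes it through n lossy elements in series."""
    return per_component_transmission ** n_components

# Even 0.5% insertion loss per element compounds quickly:
print(f"{survival(0.995, 12):.3f}")   # 0.942 over a dozen elements
print(f"{survival(0.995, 100):.3f}")  # 0.606 over a hundred
# For 99.9% of photons to survive a 12-element path, each element needs:
required = 0.999 ** (1 / 12)
print(f"{required:.5f}")              # 0.99992 per element
```

The last figure shows why "a few percent" of switch insertion loss is untenable: per-element transmission must sit four nines deep before deep error-correcting circuits become viable.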
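The feedforward timing constraint can be turned into a back-of-envelope budget from the ~20 cm/ns propagation speed quoted in the second bullet above. Every latency figure in this sketch is a hypothetical placeholder, not a measured Quandela value:

```python
# Back-of-envelope feedforward timing budget. All latency figures below
# are hypothetical placeholders, not measured Quandela values.
SPEED_IN_FIBER_M_PER_NS = 0.2  # ~20 cm per nanosecond, as noted in the text

def storage_time_ns(delay_line_m: float) -> float:
    """Decision time bought by storing a photon in a fiber delay line."""
    return delay_line_m / SPEED_IN_FIBER_M_PER_NS

budget_ns = storage_time_ns(5.0)  # a 5 m spool buys 25 ns
stage_latencies_ns = {"detector": 10.0, "classical logic": 8.0, "switch drive": 5.0}
margin_ns = budget_ns - sum(stage_latencies_ns.values())
print(budget_ns, margin_ns)  # 25.0 2.0, i.e. only 2 ns of slack
```

With plausible stage latencies the slack is a couple of nanoseconds, and longer delay lines are not free because every extra meter of fiber adds loss, tying this challenge back to the loss budget above.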
Manufacturing and Industrialization Challenges:
- Yield and Reproducibility: To build dozens, then hundreds, then thousands of qubits, Quandela will need to reliably produce many copies of its photonic and emitter components. Semiconductor quantum dots, for example, can vary in emission wavelength and efficiency from chip to chip. Ensuring each source meets the needed specs (indistinguishability of photons, brightness, stability) may require screening and binning devices or developing on-chip tunability (like electric or strain tuning to align wavelengths). Photonic integrated circuits likewise must be fabricated with low variability so that interferometers are balanced or can be trimmed. The company’s roadmap of a “second factory by 2027” implies a scaling of production, but the underlying processes might need refinement to reach high yield. Any yield shortfall (e.g., only 1 in 5 chips works as intended) could slow down deployment and increase costs. Additionally, packaging photonic chips with cryogenic sources and detectors is non-standard manufacturing – essentially building a hybrid photonic-electronic-cryogenic system. Quandela will have to develop custom assembly and packaging techniques (which they have started with their Massy assembly line) and ensure these processes can ramp up.
- Component Supply Chain: Some specialized parts of photonic QCs (like SNSPD detectors or certain high-end laser systems) have limited suppliers. Scaling up might be constrained by how many detectors can be produced or how many cryocoolers are available. Attocube (mentioned in a EuroHPC consortium with Quandela) might supply nano-positioners or cryogenic hardware; coordinating multiple vendors and ensuring each piece meets specs as systems scale is a managerial and technical challenge. Moreover, maintaining quality control across two continents (France and the new Canada hub) requires robust procedures; any misalignment or environmental difference could introduce inconsistencies.
- Energy and Footprint: While photonic machines avoid huge dilution refrigerators, they still consume power – e.g. lasers, cryostats, control electronics. Quandela cites that their largest envisioned quantum computers will consume far less power than HPC centers (orders of magnitude less), but engineering this requires efficiency optimizations. Each additional 3 kW cryocooler for a source or bank of detectors adds to power draw. Scaling to hundreds of qubits might require multiple cryostats. Keeping power and cooling manageable while scaling up is a practical hurdle. They will need to consolidate multiple quantum dot sources into one cryostat or find ways to multiplex many detectors into one cold space.
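The cost of the yield shortfall noted in the first bullet above is straightforward arithmetic. A small sketch with assumed yields (exact integer math, so no rounding surprises):

```python
# Yield arithmetic in exact integers; the yield percentages are assumed
# examples, not Quandela production data.
def chips_to_fabricate(working_needed: int, yield_percent: int) -> int:
    """Ceiling of working_needed / (yield_percent / 100), computed exactly."""
    return -(-working_needed * 100 // yield_percent)

print(chips_to_fabricate(100, 80))  # 125: modest overhead at 80% yield
print(chips_to_fabricate(100, 20))  # 500: a 1-in-5 yield multiplies volume 5x
```

A 1-in-5 yield does not just slow deployment; it multiplies fabrication volume, testing time, and per-system cost by the same factor.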
Software and Algorithm Challenges:
- Error Correction Software: As they push towards fault tolerance, developing the compilers and decoders for error-correcting codes is crucial. Quandela plans dedicated error-correction compilers by 2027. Ensuring their hardware and codes work together will be a challenge. Fast decoding algorithms are needed to correct errors on the fly, which could be computationally intensive. If error rates are only just below threshold, decoders must be very efficient to keep up. This is a whole domain of challenge beyond hardware: making sure the error correction overhead (additional qubits and gates) doesn’t overwhelm the system’s capabilities. It’s one thing to have 50 physical qubits with 99% fidelity, another to orchestrate them into a logical qubit with real-time syndrome extraction and correction.
- Demonstrating Quantum Advantage (Utility): On the application front, Quandela must navigate the interim period before full fault tolerance to show useful outcomes. They speak of pursuing “quantum utility” – practical advantages on real problems. This can be challenging with NISQ-level photonic machines that have 6-12 qubits. Competing modalities have achieved high-profile results (e.g., Gaussian Boson Sampling experiments with hundreds of squeezed photons, though not error-corrected). Quandela’s approach is more gate-based and few-qubit for now, so finding problems where a 12-qubit photonic machine excels over classical computing is non-trivial. They did target QML tasks (as seen with the BMW-Airbus challenge win), which is promising. But to keep investors and partners convinced, they’ll need to continue expanding the repertoire of use-case demonstrations – possibly in optimization, simulation, or machine learning – that highlight photonics’ strength (maybe in speed or parallelism). If their machines are perceived as not delivering quantum advantage soon, there’s a commercial risk. This is partly a software/algorithm challenge: identifying algorithms well-suited to their hardware’s characteristics (e.g. high parallel operations, optical interferometry strengths).
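To make the decoding discussion in the first bullet above concrete, here is a toy decoder for the classic three-bit repetition code. It is purely illustrative, not the code family Quandela is developing, but it shows the syndrome-lookup-correct loop that must run in real time at scale:

```python
# Toy decoder for the 3-bit repetition code. Purely illustrative: real
# fault-tolerant photonic schemes use far larger codes and faster decoders.
def syndrome(bits):
    """Parity checks on neighbouring bits (Z1Z2 and Z2Z3 for bit flips)."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def decode(bits):
    """Look up the single bit-flip location from the syndrome and undo it."""
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(bits))
    corrected = list(bits)
    if flip is not None:
        corrected[flip] ^= 1
    return corrected

print(decode([0, 1, 0]))  # [0, 0, 0]: the middle flip is corrected
print(decode([1, 1, 1]))  # [1, 1, 1]: a clean logical 1 is untouched
```

Even in this trivial case the decoder must keep pace with syndrome extraction; for the large codes fault tolerance requires, the lookup table becomes an optimization problem in its own right, which is why efficient decoding is flagged as a challenge beyond the hardware.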
Competitive and Commercial Challenges:
- Competitive Landscape: In the global context, Quandela competes with well-funded efforts such as PsiQuantum (which aims for a million-photon cluster-state computer, in partnership with GlobalFoundries) and Xanadu (pursuing Gaussian photonic approaches, exemplified by its Borealis boson-sampling machine). These companies, especially PsiQuantum, have raised large capital and have partnerships that could potentially outpace Quandela in scaling if certain hurdles are cleared. Quandela’s differentiation is its hybrid spin approach and quicker delivery of small machines. The challenge will be maintaining a technological edge and demonstrating scalability before a larger competitor achieves a breakthrough. For example, if PsiQuantum or another player demonstrates a logical qubit or a 100-qubit photonic sampler sooner, Quandela might face pressure to keep up. On the flip side, being in Europe with support from EU and French programs is an advantage, but also carries expectations to be Europe’s contender in the quantum race.
- Talent and Resources: Building cutting-edge quantum hardware requires top-tier talent in multiple domains (quantum optics, cryogenics, nanofab, software, etc.). As Quandela grows (they have ~100 employees as of 2024), scaling the team and retaining expertise is a challenge. They’ll need to train more engineers for assembly, more researchers for algorithm and hardware co-design. The multi-disciplinary nature might strain a startup-sized organization. Moreover, executing the roadmap likely demands significant funding – for new facilities, more R&D, etc. Ensuring continued investment (through revenue, grants, or VC funding) is a commercial challenge common to all startups in this space.
In summary, Quandela’s challenges span from the physics of keeping photons in line to the logistics of building machines at scale. Technically, they must perfect photonic circuits with minimal loss and integrate ultra-fast feedforward control – essentially pushing the state-of-the-art in optical engineering. They also must implement the novel spin-photon hardware which is unproven at scale. Industrially, they need to replicate their successes in a mass-producible way, with high yield and consistent performance, all while coordinating global deployments. And they must do all this in a race against time and competition, proving that their approach can deliver not just on paper but in real-world value. These are non-trivial hurdles, but if overcome, they will solidify Quandela’s position and could usher photonic quantum computing into the mainstream. The next few years – hitting the 2025 logical qubit demo and the 2026-2028 scaling – will be critical tests of whether these challenges can be managed.