QuiX Quantum
(This profile is one entry in my 2025 series on quantum hardware roadmaps and CRQC risk. For the cross‑vendor overview, filters, and links to all companies, see Quantum Hardware Companies and Roadmaps Comparison 2025.)
Introduction
QuiX Quantum is a Dutch quantum technology company specializing in photonic quantum computing hardware. Founded in 2019 as a University of Twente spin-off, QuiX has quickly grown into a European leader in photonic quantum processors and systems. Unlike matter-based qubit platforms that use stationary quantum bits on a cryogenic chip, QuiX’s approach encodes qubits in photons – particles of light that serve as “flying” qubits traveling through optical circuits. This photonic modality allows quantum operations at room temperature and at light-speed, leveraging integrated photonic chips (made of low-loss silicon nitride) as the core processing units. By building quantum interferometer chips and combining them with single-photon sources and detectors, QuiX aims to realize a universal photonic quantum computer capable of executing any quantum algorithm.
The company has defined a clear R&D roadmap toward machines that are first universal, then error-corrected, and ultimately fault-tolerant, positioning its technology as a scalable pathway to practical quantum computing.
Milestones & Roadmap
Since its inception, QuiX Quantum has marked several important milestones on its hardware development journey. In late 2020, the company demonstrated a record-breaking 12-mode programmable photonic processor – essentially a chip-scale interferometer with 12 optical modes – and by early 2022 it had scaled up to a 20-mode Quantum Photonic Processor (QPP), the largest of its kind at that time.
These integrated photonic circuits are fully reconfigurable linear optical networks that can implement arbitrary unitary transformations on multiple optical modes with high fidelity (≈98.6% in the 12-mode chip) and low loss, as described in the paper “High Fidelity 12-Mode Quantum Photonic Processor Operating at InGaAs Quantum Dot Wavelength”. Such processors serve as the “engines” of a photonic quantum computer.
In 2022, on the strength of its processor technology, QuiX achieved a major commercial milestone by becoming the first company in the world to sell photonic quantum computers of any kind to an end-user. Specifically, the German Aerospace Center (DLR) acquired from QuiX an 8-qubit and a 64-qubit photonic quantum computing system as part of DLR’s Quantum Computing Initiative. This contract, valued at €14 million, represented the world’s first sale of a universal quantum computer based on photonics. The delivered devices integrate QuiX’s proven photonic processors with single-photon sources, detectors, and fast feed-forward control to create a modular photonic computing platform. The ultimate goal of that four-year project (launched in 2022) is to realize a 64-qubit universal photonic quantum computer for DLR, with an intermediate 8-qubit prototype as a testbed. This early sale underscored QuiX’s track record of translating photonic chip technology into full systems and highlighted the confidence of large institutions in its approach.
By 2023-2024, QuiX continued to expand both its technology and its accessibility. In 2024 the company launched a cloud-based quantum computing service, allowing external users to run experiments on QuiX photonic processors remotely. This cloud access provides a hybrid computing platform and a sandbox for real-world applications in domains such as infrastructure, defense, healthcare, and IT.
On the R&D front, QuiX achieved a breakthrough in photonic entanglement generation: researchers using its integrated photonics technology demonstrated on-chip generation of Greenberger-Horne-Zeilinger (GHZ) states of three photons. A GHZ state is a three-qubit entangled state that serves as a fundamental building block for large entangled structures (so-called resource states) in one-way photonic quantum computing. This result, published in Physical Review Letters in March 2024, was the first-ever demonstration of a heralded 3-photon GHZ state on a chip. The experiment employed a fully programmable low-loss photonic chip manipulating six indistinguishable single photons at telecom wavelength to produce the 3-photon entangled state, heralded (post-selected) by ancillary detections. Achieving this multi-photon entanglement on chip is widely seen as a pivotal milestone for photonic quantum computing, since GHZ or similar states are needed to build the large cluster states required for universal quantum computation with photons. This milestone experimentally confirmed that QuiX’s integrated photonic platform can generate the entangled resources necessary for scaling up quantum circuits, thus reinforcing the feasibility of their long-term hardware strategy.
Looking ahead, QuiX Quantum has laid out a roadmap (less formally than some other companies, largely through public statements from its CEO) for moving from today’s prototypes to tomorrow’s powerful quantum machines. With a €15 million Series A funding round secured in mid-2025, the company is now driving toward delivery of its first-generation universal photonic quantum computer by 2026. This 2026 system is planned as an ~8-qubit universal quantum computer implementing a complete quantum gate set – meaning it will be programmable to execute any quantum algorithm (as opposed to narrower, special-purpose photonic processors such as boson samplers).
Achieving “universality” in the photonic domain will require surmounting long-standing technical hurdles, notably the realization of high-speed feed-forward control and high-quality on-demand single-photon sources – challenges that QuiX has explicitly focused on in this first-generation design. The 2026 machine is thus expected to demonstrate for the first time that a photonic quantum computer can perform arbitrary computational routines, using fast active switching and measurements to enact two-qubit gates in a measurement-based framework.
Following this, QuiX plans a second-generation system in 2027 that will introduce quantum error correction features. In the words of QuiX CEO Dr. Stefan Hengesbach, “with our first-generation system in 2026, we will demonstrate universality… The next-generation system, planned for 2027, will focus on implementing error correction, a crucial step towards fault-tolerant systems”. Concretely, after proving the basic universal gate operations in the 8-qubit device, the 2027 iteration (possibly on the order of 64 physical qubits) is slated to incorporate the encoding of logical qubits with redundancy to detect and correct photon loss errors.
Beyond 2027, the roadmap envisions scaling toward fault-tolerant photonic quantum computers by the end of the decade – machines that can sustain arbitrarily long computations by actively correcting errors, and that can be expanded modularly to large qubit counts.
QuiX’s ultimate ambition is to provide hardware capable of reaching hundreds of logical qubits (comprising many physical photonic qubits each) within the foreseeable future. According to the CEO, the company is on track with this plan: the near-term focus is delivering the 2026 universal prototype and then progressing toward error-corrected, fault-tolerant architectures, with an eye on scaling up the number of physical qubits significantly to achieve over 100 logical qubits in the future.
In summary, QuiX’s roadmap outlines a fast-paced trajectory from today’s small-scale photonic processors to a large-scale, modular quantum computing network – moving stepwise through demonstrating universality, integrating error correction, and ultimately reaching the fault-tolerance needed for practical quantum advantage.
Focus on Fault Tolerance
Fault tolerance – the ability of a quantum computer to continue operating correctly even when individual qubits or operations fail or suffer errors – is a central long-term goal of QuiX Quantum’s hardware development.
Photonic quantum computing presents a unique blend of challenges and opportunities in the quest for fault-tolerant operation. On one hand, photonic qubits have inherently low environmental noise (a photon traveling through a waveguide doesn’t easily decohere via unwanted interactions) and can be manipulated at room temperature without exotic cooling. This means that certain error sources common in other platforms (e.g. decoherence from material defects or thermal noise) are largely absent or negligible for photons. Indeed, QuiX maintains that quantum error correction may be easier to implement in photonic systems than in modalities like superconducting qubits, because the error landscape is simpler – essentially dominated by photon loss rather than a multitude of noise channels.
The primary error in photonic circuits is the loss or absorption of some photons before they can be detected or interfere. If photon loss can be kept below a critical threshold, error-correcting codes can in principle stabilize the computation.
QuiX’s photonic chips are built on silicon nitride (Si3N4) waveguides, a platform chosen for its exceptionally low optical losses and broadband low-noise properties. This minimizes attenuation of light and helps keep the overall transmission of photons through the circuit above the required threshold for error-corrected operation.
In short, the company’s integrated photonics approach attacks fault tolerance at the hardware level by reducing loss per component (via low-loss materials and efficient design) and by operating at conditions (telecom wavelengths, room temperature) where photon generation and detection can be optimized. That said, reaching fault tolerance will also demand architectural and system-level innovations beyond just low-loss chips.
QuiX Quantum’s strategy for achieving fault-tolerant photonic quantum computing centers on the measurement-based quantum computing (MBQC) paradigm and the creation of large entangled cluster states of photons. In an MBQC scheme, instead of performing sequential gate operations as in a circuit model, one prepares a highly entangled multi-qubit resource state (a cluster) and then carries out computations by measuring individual qubits in the cluster with adaptive choices of measurement basis. This approach is particularly suited to photonics, where entangled “resource states” can be generated offline and consumed during computation. The GHZ state (three-photon entangled state) that QuiX demonstrated in 2024 is an example of a small resource state that can be fused into larger cluster states. In fact, the recent research by QuiX and collaborators explicitly used a fusion-based quantum computation model, which envisions building up large photonic cluster states by fusing many small entangled units like GHZ states. This model has the attractive feature of being loss-tolerant and not requiring feed-forward during the cluster creation phase. However, a critical obstacle was the ability to efficiently generate those small entangled resource states on-chip, an obstacle that QuiX’s experiment overcame by realizing the heralded 3-photon GHZ state in an integrated device.
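To make the resource-state idea concrete, here is a minimal NumPy sketch (illustration only, not QuiX software) that constructs the three-photon GHZ state and shows its defining entanglement property: the full state is pure, but discarding any one photon leaves the remaining pair in a mixed state.

```python
import numpy as np

# Illustrative sketch: the 3-qubit GHZ state |GHZ> = (|000> + |111>)/sqrt(2),
# the kind of small entangled resource that fusion-based schemes stitch into clusters.
ghz = np.zeros(8)
ghz[0b000] = 1 / np.sqrt(2)
ghz[0b111] = 1 / np.sqrt(2)

rho = np.outer(ghz, ghz.conj())              # density matrix of the pure state

# Trace out the third qubit: the remaining pair is left in a mixed state,
# a signature of genuine multipartite entanglement.
rho6 = rho.reshape(2, 2, 2, 2, 2, 2)         # axes: (q1, q2, q3, q1', q2', q3')
rho_12 = np.einsum('abkcdk->abcd', rho6).reshape(4, 4)

print("purity of full state:   ", np.trace(rho @ rho).real)        # 1.0
print("purity after tracing q3:", np.trace(rho_12 @ rho_12).real)  # 0.5
```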
With GHZ states now achievable on the QuiX platform, the next step is scaling up the number of such entangled photons dramatically. As QuiX’s Chief Scientist Dr. Jelmer Renema put it, “the next challenge is now making many of these [GHZ] devices… one GHZ state is the spark needed to create a blazing fire. The more GHZ states a photonic quantum computer contains, the more powerful it becomes”. In practice, achieving fault-tolerant cluster-state quantum computing will require massive parallelization: generating a stream of entangled photons at a high rate, entangling them into large clusters, and performing fast adaptive measurements. The company’s roadmap reflects this: the first-gen system (2026) is about demonstrating the fundamental capability to do adaptive measurements (feed-forward) on entangled photonic qubits, and the second-gen (2027) is about actively encoding logical qubits with redundancy to correct for losses.
There are several concrete technological conditions that must be met to achieve error correction and fault tolerance in photonic hardware, and QuiX is actively addressing each.
- First, as noted, the overall system loss rate must be below the error-correction threshold so that adding more physical qubits (photons) to form a logical qubit actually improves reliability. This drives efforts in photonic chip fabrication (to minimize propagation loss in waveguides and couplers) and in optical packaging (to minimize coupling losses in and out of the chip). QuiX’s use of Si3N4 PICs, which can operate with propagation losses below 0.1 dB/cm in some cases, is aimed at satisfying this condition (a rough loss-budget sketch follows this list).
- Second, the quantum computer must reliably generate a sufficient number of single photons in each clock cycle to supply the error-correcting codes with the needed redundancy. This is a formidable challenge: unlike a matter-qubit device where all qubits exist at once, a photonic device needs to create fresh photons on demand. QuiX recognizes that single-photon sources are a critical piece – the 2026 prototype will incorporate state-of-the-art sources, and further R&D (including possibly quantum dot-based emitters or parametric down-conversion with multiplexing) is ongoing to boost source brightness and reliability. The company has noted that overcoming the long-standing limitations of single-photon sources is part of demonstrating universality in the first-gen machine.
- Third, given those prerequisites, the system must be scalable and modular, meaning one can connect multiple identical photonic computing units to increase the logical qubit count without a prohibitive overhead. Photonics is naturally well-suited to modularity: separate photonic processors (each producing a logical qubit or a cluster state) can be linked via optical fibers to form a larger quantum network. QuiX explicitly envisions connecting modular photonic quantum computers to scale up – an approach far more feasible if each module is room-temperature and compact, as photonics allows. In fact, Dr. Hengesbach remarks that once a single logical qubit module is realized without significant loss, it can be optically connected to neighboring modules to expand the quantum computer, and “thanks to their modularity, [photonic] qubits can then be easily added” in a plug-and-play fashion. Notably, the faster each module operates (i.e. higher clock speed), the more compact the overall system can be for a given throughput – and photonic qubits, traveling at the speed of light, inherently support very fast gate cycles compared to, say, trapped-ion systems.
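The loss-budget sketch referenced above is shown here. All component values and the threshold figure are illustrative assumptions, not QuiX specifications; the point is simply how per-component losses in dB stack up into an end-to-end transmission that must clear whatever threshold the chosen error-correcting code demands.

```python
# Hypothetical loss budget for one photon traversing a photonic module.
# All numbers are illustrative assumptions, not QuiX specifications.
losses_db = {
    "fiber-to-chip coupling": 1.0,        # dB per facet
    "on-chip propagation":    0.1 * 5,    # 0.1 dB/cm over ~5 cm of waveguide
    "chip-to-fiber coupling": 1.0,
    "detector inefficiency":  0.3,        # ~93%-efficient detector
}

total_db = sum(losses_db.values())
transmission = 10 ** (-total_db / 10)
print(f"total loss: {total_db:.2f} dB  ->  end-to-end transmission {transmission:.1%}")

# Compare against an assumed per-photon transmission threshold for the code in use.
assumed_threshold = 0.90   # placeholder; real thresholds depend on code and architecture
print("loss budget OK" if transmission >= assumed_threshold
      else "too lossy for the assumed threshold: more engineering needed")
```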
Thus, QuiX’s philosophy is that in photonics it makes sense to “first significantly increase [qubit] quality and then scale up”. They are focusing on perfecting the hardware for a logical qubit (minimizing loss, improving sources, fast detection and feed-forward) as a precursor to scaling out to many logical qubits. In summary, while true fault-tolerant quantum computing remains a long-term aspiration, QuiX has laid out a plausible path to get there: achieve error rates below threshold via low-loss integrated photonics, implement photonic error correction by creating entangled logical qubits that can survive some photon losses, and then scale modularly by networking multiple photonic modules. Each of these steps is under active development, supported by recent breakthroughs (like on-chip GHZ entanglement) and the planned milestones of the next few years. If successful, this approach could culminate in one of the first fault-tolerant quantum architectures, with photons carrying quantum information robustly through an optical computing fabric.
CRQC Implications
The term CRQC refers to a Cryptographically Relevant Quantum Computer – a quantum machine powerful enough to break present-day cryptographic systems (such as RSA or ECC encryption) by running algorithms like Shor’s factoring algorithm. Achieving a CRQC is widely seen as a major inflection point with broad societal implications, and it typically demands a quantum computer with a large number of reliable qubits (on the order of thousands of logical qubits for breaking RSA-2048, according to many estimates).
While QuiX Quantum’s current prototypes are far from this scale, the company’s development roadmap directly targets the prerequisites for a future CRQC, and its leadership has expressed confidence that the photonic approach can eventually reach cryptography-shattering capabilities. A key step is universality – the ability to run arbitrary quantum algorithms. QuiX’s photonic hardware is being built to support any known quantum algorithm, “including Shor’s and Deutsch’s,” as Dr. Hengesbach noted. The first universal photonic quantum computer (the 8-qubit system planned for 2026) will be too small to factor large numbers, but it will validate the gate mechanisms needed for algorithms like Shor’s. Hengesbach also pointed out that the initial quantum algorithms expected to definitively outperform classical computers might require “just hundreds of logical qubits,” a threshold he believes is achievable in the foreseeable future. This suggests that even before reaching the thousands of logical qubits for full cryptographic break, there may be intermediate milestones (e.g. factoring smaller keys or solving certain problems faster than classical) once a few hundred error-corrected qubits are operational. QuiX’s technology vision aligns with this in that they aim to deliver on the order of 100+ logical qubits in future generations – a scale at which non-trivial algorithms including some cryptographic ones could begin to have impact.
For a photonic quantum computer to become cryptographically relevant, it must achieve fault tolerance and scale in qubit count, as discussed in the previous section. QuiX’s focus on error correction and modularity speaks directly to the requirements of CRQC-class machines. In particular, once error-corrected logical qubits are realized in photonics, scaling up the number of those logical qubits via optical interconnects could be more straightforward than in some other platforms (thanks to the ease of networking photons).
The company’s partnership with DLR also highlights a focus on cryptography: the DLR contract explicitly identifies “post-quantum cryptography” among the target application areas for the photonic quantum computer being developed. This implies using the quantum computer to address cryptographic challenges – possibly both by breaking existing cryptosystems and by vetting quantum-resistant cryptographic algorithms. Defense and security are clearly on the agenda. Hengesbach noted that defense and intelligence use-cases would likely require a medium number of logical qubits to provide value. This category would include breaking encryption or other cryptanalysis tasks that intelligence agencies care about. In contrast, some other applications (like certain chemistry problems) might need fewer, and tasks like exhaustive database search (Grover’s algorithm) might need even more qubits.
The implication is that a photonic quantum computer with a few hundred logical qubits could already threaten some cryptographic schemes or at least demonstrate a clear quantum advantage in cryptography-related problems.
QuiX’s timeline (early 2030s for fault-tolerant machines) intersects with the often-cited horizon when quantum computers might become dangerous to current encryption, underscoring why the company and its partners are interested in CRQC implications. By pursuing a fault-tolerant photonic architecture, QuiX is essentially building toward the kind of scalable, stable quantum hardware that a CRQC would require. If their approach succeeds, one can envision a photonic quantum data-center where modules of optical quantum processors are chained together to create thousands of entangled logical qubits – a scenario in which algorithms like Shor’s could indeed factor large integers and break RSA encryption in a reasonable time.
In summary, while QuiX Quantum’s immediate goals are more modest (tens of physical qubits and demonstration of error correction), the broader ambition is very much cryptography-relevant. The universality of their design ensures compatibility with cryptographic algorithms, and the emphasis on fault tolerance and scaling is paving the way for a future photonic quantum computer that could meet or exceed the CRQC threshold. Their progress will be closely watched by those in cybersecurity: each step towards a larger and more stable photonic quantum processor is a step closer to the day when today’s encryption might yield to quantum attacks. Encouragingly (or alarmingly, depending on perspective), QuiX’s leadership believes that only a few hundred logical qubits – which they aim to attain – could be enough to start outperforming classical machines on important tasks, putting a CRQC within reach once error-corrected photonic clusters can be scaled up to that range.
Modality & Strengths/Trade-offs
QuiX Quantum’s hardware platform is based on the photonic modality of quantum computing, which carries several distinct strengths and trade-offs compared to more established modalities like superconducting circuits or trapped ions.
In a photonic quantum computer, information is encoded in individual photons (or in properties of light fields) that propagate through optical components. These qubits are mobile – they move at the speed of light through waveguides or optical fibers – in contrast to the stationary qubits of matter-based systems that sit in place on a chip or in an ion trap. The use of “flying” qubits means that photonic processors rely on a network of beam splitters, phase shifters, and interferometers to make photons interact (via interference) and to route them as needed.
QuiX leverages integrated photonics, meaning all these optical components are miniaturized onto a chip, to achieve stability and scalability. Their processor chips are fabricated in silicon nitride, a material with excellent optical properties: it supports low-loss waveguides over a broad range of wavelengths and maintains coherence of photons over long distances on-chip. The basic building blocks on these chips are tunable Mach-Zehnder interferometers and phase shifters arranged in a mesh, allowing arbitrary linear optical transformations on dozens of modes. This integrated approach yields phase-stable, reconfigurable interferometer networks far more compact than bulk-optics setups. For example, QuiX’s 12-mode photonic processor contains 66 programmable interferometric cells (132 thermo-optic phase actuators) on a single chip a few centimeters in size, and the 20-mode successor scales up this mesh to implement arbitrary 20×20 unitary transformations with high fidelity. Such chip-scale interferometers are essentially the “quantum logic boards” of the photonic computer.

One immediate strength of this photonic platform is that all operations occur at or near room temperature – unlike superconducting qubit systems which require dilution refrigerators at millikelvin temperatures. QuiX’s devices can operate in standard data-center environments without special cryogenics. This drastically reduces operational complexity and cost: multiple photonic racks could be deployed in a server room, akin to classical optical networking gear, without the overhead of vacuum chambers or cryostats. As Hengesbach noted, QuiX’s photonic quantum computers are designed for use in data centers “without retrofitting complex liquid helium cooling,” and can achieve high clock speeds since photons naturally propagate fast.
Indeed, among quantum technologies, photonic qubits offer some of the highest potential clock rates, because gates can be as fast as the travel time of light through small circuits (on the order of picoseconds to nanoseconds for chip-scale distances).
Another strength is inherent connectivity: in photonic interferometer networks (especially in free-space or fiber implementations), any qubit can in principle interact (interfere) with any other via appropriate beam splitter configurations. QuiX’s special-purpose boson sampler device, for instance, was essentially a large all-to-all interferometer – “structured like a giant chessboard of intersections” – where 100 input modes would require a 100×100 mesh of beam splitters so that all inputs interfere with all others. While that particular architecture grows in complexity for large N, it illustrates that photonics doesn’t have a hardwired connectivity constraint the way, say, a 2D superconducting qubit array does (which typically only has nearest-neighbor coupling). Photonic qubits can be distributed and routed flexibly using waveguides or fibers, enabling long-range entanglement without additional overhead. This also makes distributed quantum computing and networking more natural – multiple photonic chips can be connected by optical fiber, and photons can even travel between distant nodes with minimal loss, something not possible with material qubits.
In summary, photonic quantum hardware offers scalability in communication (qubits can travel long distances and link modules) and freedom from extreme cooling, which are compelling advantages for building large quantum machines.
The use of light for qubits also brings other practical benefits:
Photonic systems are generally insensitive to electromagnetic noise and do not require isolated magnetic environments or vacuum – the qubits (photons) don’t get easily perturbed by stray fields or gas particles in the lab. The main noise channel, photon loss, can be mitigated by using low-loss optics as discussed, and by operating at telecom wavelengths where fiber and waveguide transmission is optimized. QuiX’s selection of telecom-band operation (around 1550 nm and also exploring ~930 nm for quantum dot sources) ensures compatibility with the vast fiber-optic infrastructure and high-quality detectors available from optical communications.
Additionally, integrated photonics is a mature, commercially scalable technology thanks to decades of development in the telecom and silicon photonics industries. Many of the fabrication processes (lithography, etching, deposition) and components (waveguides, couplers, modulators) used in QuiX’s chips benefit from standard semiconductor industry techniques. QuiX emphasizes that all its components and system designs are optimized from the start for high-volume manufacturing and scalability. Their silicon-nitride PIC platform is patented and designed to be modular and reproducible, potentially allowing mass production of identical photonic processors. This stands in contrast to, say, superconducting qubit fabrication which, while improving, still faces variability and yield issues for large qubit counts.
The photonic chips are also energy-efficient: they require electrical power primarily for thermo-optic phase shifters or fast modulators, but no continuous power to maintain qubit states (photons propagate freely). Combined with room-temp operation, this yields a favorable energy and footprint profile, supporting QuiX’s claim that photonic quantum computers will be “compact, energy-efficient, and easy to maintain” when installed in data centers.
In terms of performance indicators, QuiX’s integrated devices have shown impressive metrics: as noted, a 12-mode universal photonic chip achieved ~98.6% fidelity in implementing arbitrary 12×12 unitary transformations, indicating very precise control over interference and phase. The same device had mean optical loss low enough to be compatible with single-photon experiments (the whitepaper suggests a total insertion loss of only a few dB). Photon sources integrated or interfaced with these chips (such as quantum dot emitters) can deliver highly indistinguishable photons (~90% indistinguishability demonstrated in leading QD sources), which is crucial for consistent two-photon interference and entanglement.
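To make these figures concrete, the toy NumPy sketch below (generic linear optics, not QuiX’s design or calibration software) builds a mesh of tunable Mach-Zehnder cells – 66 of which suffice for a universal 12-mode interferometer – and scores an imperfectly set mesh against its target with a simple global-phase-insensitive fidelity. The exact fidelity definition used in QuiX’s publications may differ.

```python
import numpy as np

def mzi(theta: float, phi: float) -> np.ndarray:
    """2x2 transfer matrix of a tunable Mach-Zehnder cell (one common convention):
    an input phase shifter phi, then a variable beam splitter whose splitting
    ratio is set by the internal phase theta."""
    bs = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)     # 50:50 beam splitter
    return bs @ np.diag([np.exp(1j * theta), 1]) @ bs @ np.diag([np.exp(1j * phi), 1])

def embed(cell: np.ndarray, m: int, n_modes: int) -> np.ndarray:
    """Embed a 2x2 cell acting on neighbouring modes (m, m+1) into N modes."""
    u = np.eye(n_modes, dtype=complex)
    u[m:m + 2, m:m + 2] = cell
    return u

def unitary_fidelity(u_target: np.ndarray, u_impl: np.ndarray) -> float:
    """|Tr(U_t^dag U_i)| / N: a standard, global-phase-insensitive similarity
    measure between two N x N unitaries (published characterizations may use
    related but not identical definitions)."""
    return abs(np.trace(u_target.conj().T @ u_impl)) / u_target.shape[0]

n = 12
print("MZI cells for a universal mesh:", n * (n - 1) // 2)    # 66 for 12 modes

# Toy circuit: a few randomly tuned cells composed into a 12x12 unitary.
rng = np.random.default_rng(0)
u = np.eye(n, dtype=complex)
for m in (0, 2, 4, 6, 1, 3, 5):                               # arbitrary cell placements
    u = embed(mzi(*rng.uniform(0, 2 * np.pi, size=2)), m, n) @ u
print("still unitary:", np.allclose(u.conj().T @ u, np.eye(n)))

# Imperfect phase settings lower the fidelity to the intended unitary.
u_noisy = u @ np.diag(np.exp(1j * rng.normal(scale=0.05, size=n)))
print(f"fidelity with small phase errors: {unitary_fidelity(u, u_noisy):.4f}")
```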
Overall, the strengths of QuiX’s photonic modality include high stability and fidelity (due to integrated optics), room-temperature operability, potentially fast gate speeds, and a natural path to scaling via networking of modules, all built on a platform amenable to industrial fabrication.
Of course, no quantum hardware platform is without trade-offs and challenges, and photonic systems have their own:
The foremost challenge is the probabilistic nature of photonic two-qubit operations. Photons do not naturally interact with each other strongly; entangling two photonic qubits typically requires them to interfere on a beam splitter and for specific measurement outcomes (such as a particular detector click pattern) to occur. This means many photonic quantum logic operations succeed only probabilistically. For example, the standard approach to a photonic controlled-NOT gate might only succeed 25% of the time (with certain photon detections indicating success) and otherwise needs to be retried. One way around this is measurement-based computation with prepared cluster states, which is exactly the approach QuiX is taking – shifting the probabilistic operations into the state-preparation stage. But even in that approach, creating the cluster state itself probabilistically (via fusion gates) can require many attempts or parallel operations to ensure enough successful entanglements each cycle.
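A quick back-of-envelope calculation (assumed numbers, not measured QuiX figures) shows why this matters: with per-gate success probability p, a post-selected circuit of G probabilistic gates succeeds only with probability p^G, which collapses exponentially and motivates pushing the probabilistic steps into repeatable resource-state preparation.

```python
# Why probabilistic gates compose badly in a plain post-selected circuit model.
# The per-gate success probability is an illustrative assumption.
p = 0.25
for g in (1, 5, 10, 20):
    print(f"{g:2d} gates -> overall success {p**g:.2e}, "
          f"expected repetitions ~{1 / p**g:.1e}")
```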
Single-photon sources are another trade-off area: unlike a stable atomic qubit that sits ready in a trap, a photonic qubit must be generated at the moment of use. Current sources, whether based on nonlinear optics (parametric down-conversion) or quantum emitters (quantum dots), do not produce photons with 100% success each pulse. State-of-the-art quantum dot sources achieve around 13% single-photon emission probability per pulse (brightness) with about 90% indistinguishability. This means a lot of pulses contain no photon, and many photons from independent sources have slight differences. Multiplexing techniques – using many parallel sources and selecting a successful photon – can raise the effective probability, but at the cost of more complexity. QuiX will need to incorporate such techniques or rely on significant improvements in source technology to feed a large quantum computer. Indeed, scaling to hundreds of photons per cycle with current source efficiency is a major challenge. The company has acknowledged that overcoming single-photon source limitations is a key focus in making their 2026 universal machine work.
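The arithmetic below (illustrative assumptions, not QuiX specifications) shows both the problem and one standard mitigation: the chance that several independent sources all fire in the same pulse falls off exponentially, while multiplexing many heralded attempts per source pushes the effective emission probability back toward one, at the cost of extra hardware.

```python
# Illustrative source arithmetic (assumed numbers, not QuiX specs):
# a single source delivers a usable photon with probability p per pulse,
# so n sources all firing in the same pulse happens with probability p**n.
p = 0.13
n = 6
print(f"all {n} sources fire together: {p**n:.2e}")

# Multiplexing: combine m heralded attempts per effective source,
# so each fires with probability 1 - (1 - p)**m.
m = 30
p_mux = 1 - (1 - p) ** m
print(f"per-source probability with {m}-fold multiplexing: {p_mux:.3f}")
print(f"all {n} multiplexed sources fire together: {p_mux**n:.3f}")
```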
Another trade-off is that while photonic qubits don’t decohere in time, they can be lost at various stages. Every coupling (fiber to chip, chip to fiber, etc.), every detector inefficiency, and propagation loss reduces the photon count. Thus, maintaining high fidelity often means working with post-selection (discarding runs where some photons were lost or didn’t arrive). For example, boson sampling experiments inherently rely on post-selecting only those trials where the desired number of photons made it through the interferometer. QuiX’s boson sampler can discard certain measurements to get meaningful results, but a universal computer cannot discard too many trials without losing computational efficiency. This necessitates error correction to actively handle losses, which, as discussed, introduces significant overhead in number of physical photons needed per logical qubit.
Another hardware consideration is the need for high-speed electronics and detectors in the loop. To implement feed-forward (whereby a measurement outcome on one set of photons dictates an operation on another set flying through the circuit), the system requires ultra-fast detectors and optical switches or modulators with sub-microsecond (ideally nanosecond) response times. Integrating these classical control elements with the photonic chip is non-trivial. QuiX has identified fast feed-forward electronics as one of the long-standing challenges it needed to overcome for demonstrating universality. This might involve custom electronics or photonics-electronics integration so that, for example, a photon detection can trigger a phase shift on another waveguide in real time within a single clock cycle. Meeting these speed and integration requirements is a significant engineering effort and could be seen as a trade-off (classical complexity) in the photonic approach.
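A rough timing budget (all latency values are assumptions for illustration) makes the constraint tangible: while the detection, classical logic, and switch settle, the photons to be acted on must be parked in a delay line, and every nanosecond of latency translates directly into extra fiber and therefore extra loss.

```python
# Rough feed-forward timing budget (illustrative assumptions only).
c_fiber_m_per_ns = 0.2           # light travels ~0.2 m per ns in optical fiber

latency_ns = {
    "detector + amplification":  20,
    "classical logic (FPGA)":    50,
    "modulator/switch settling": 30,
}
total_ns = sum(latency_ns.values())
delay_fiber_m = total_ns * c_fiber_m_per_ns

print(f"total feed-forward latency: {total_ns} ns")
print(f"fiber delay needed to hold the photon: ~{delay_fiber_m:.0f} m")
# Every metre of delay fiber adds loss (and every coupling adds more),
# so shaving nanoseconds off the electronics directly reduces photon loss --
# the core tension behind fast feed-forward.
```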
Additionally, while the quantum photonic chips operate at room temperature, the single-photon detectors often do not. Many high-efficiency single-photon detectors (like superconducting nanowire detectors) need cryogenic cooling. It’s possible to use room-temperature avalanche photodiodes, but at telecom wavelengths these have lower efficiency and can introduce noise. So a full photonic quantum computer may still need some cryogenic components (for detectors) unless room-temp detectors improve. This is sometimes overlooked in the “no cryogenics” advantage of photonics – the qubits and logic don’t need cooling, but today’s best detectors often do. We can view this as a current trade-off: either use slightly lower-performance but warm detectors (sacrificing some efficiency) or incorporate cold detectors (adding complexity).
Finally, the optical routing and coupling complexity grows as systems scale. Each photonic chip might have dozens of fiber connections (for inputs, outputs, source injection, etc.). Packaging a 50-mode device means aligning 50 input fibers and 50 output fibers with micrometer precision – a daunting task, though one that photonics companies like QuiX and partners (e.g. PHIX) are actively working on. QuiX is part of projects like “QuScale” and “Qmode” to develop new interconnects and packaging methods to handle large numbers of channels into and out of photonic chips. Without innovative solutions here, coupling losses and mechanical stability could bottleneck scaling. In essence, the trade-off for photonics’ scalability in principle is complexity in practice when integrating many optical and electronic components.
To summarize, QuiX Quantum’s photonic hardware modality offers compelling strengths – high-speed, high-connectivity, room-temperature operation, low idle error, and compatibility with telecom infrastructure – which make it an attractive route to scalable quantum computing. However, it comes with non-trivial challenges: probabilistic gate operations, photon loss management, source and detector efficiency limits, and integration complexity are all issues that must be addressed for the platform to fully succeed. QuiX’s ongoing research and engineering efforts are essentially aimed at mitigating these trade-offs: developing better single-photon sources (to approach deterministic operation), using error correction to handle losses, implementing ultrafast feed-forward to remove the probabilistic nature at the computational level, and innovating in packaging to support large interferometer meshes. If these challenges can be overcome, photonic quantum computers could leverage their inherent advantages to leapfrog in scalability, fulfilling QuiX’s vision of large, modular quantum processors built from chips and fibers.
Track Record
QuiX Quantum’s track record to date showcases a blend of scientific innovation and commercial execution in the field of quantum photonics. Since its founding in January 2019, the company has consistently pushed the boundaries of integrated photonic quantum hardware. Early on, QuiX focused on developing quantum photonic processors – stand-alone photonic chips acting as linear optical circuits for quantum experiments. In 2020, they launched a 12-mode processor which at the time set a new benchmark for reconfigurable quantum photonic circuits. This was followed by a 20-mode processor in 2022, which was, as noted, the largest universal photonic interferometer demonstrated on a chip. These devices were made available to research labs and earned QuiX a reputation as the supplier of choice for photonic quantum processing hardware in Europe.
In fact, QuiX’s 20-mode photonic processors (a plug-and-play, low-loss, fully programmable interferometer unit) have been adopted as workhorse platforms in multiple leading institutes across the UK, France, Germany, and Hungary. The company’s ability to deliver high-quality hardware made it a market leader in this niche; as of 2022, QuiX claimed that its photonic chips had become “the de facto standard” for photonic quantum computing experiments in several national programs. This is a significant achievement considering the competitive landscape, and it reflects the performance and reliability of QuiX’s technology.
The integrated photonics know-how (originating from the Netherlands’ photonics cluster and UTwente’s MESA+ Institute) provided a strong foundation that translated into a patented PIC technology, touted as “the world’s most powerful Quantum Photonic Processor” by the company.
In parallel with selling processors, QuiX also demonstrated a special-purpose photonic quantum computer in the form of a boson sampler. By integrating one of its large photonic processors with photon sources and detectors, they built a non-universal photonic computer that can perform boson sampling tasks – essentially a photonic network solving a complex interferometric problem faster than classical computers for certain inputs. This not only served as a proof-of-concept for a photonic system, but also as a commercial prototype of a “quantum advantage” experiment (boson sampling is closely related to the Gaussian boson sampling experiments where photonics achieved a quantum computational advantage).
Thus, by the early 2020s, QuiX had already moved from components to small systems, contracting in 2022 to supply 8- and 64-qubit photonic quantum computers (as they refer to them) to the German Aerospace Center. These are arguably the first photonic quantum computing systems ever sold. The 8- and 64-qubit devices for DLR are being delivered as part of a multi-year project and mark Europe’s (and the world’s) first sale of universal quantum computers based on photonics. While these initial devices are prototypes under development, the fact that a national research agency invested in QuiX to build such systems speaks to the company’s credibility and track record of innovation.
On the public and scientific front, QuiX’s accomplishments are documented through high-impact publications and announcements. The highlight, as discussed, was the Physical Review Letters publication in 2024 demonstrating on-chip GHZ state generation – a milestone achievement that not only validates QuiX’s technology but also contributes to the broader quantum computing research community.
QuiX team members (including co-founder Dr. Jelmer Renema) have authored numerous papers on integrated quantum photonics, from the design of high-fidelity interferometers to experiments with quantum dot photon sources and thermally reconfigurable circuits. This strong research output underpins the company’s technical claims and keeps it at the cutting edge of photonic quantum science.
Simultaneously, QuiX has been recognized in the industry as a rising star. In 2023–2024, the company received EU EIC Accelerator funding (a prestigious and highly competitive award for deep-tech startups) to support its roadmap. In 2025, it closed a significant Series A financing round (€15M) co-led by Invest-NL (the Dutch national investment fund) and the European Innovation Council Fund. These investments signal confidence in QuiX’s track record and future plans, and they bolster the European photonic quantum computing ecosystem with QuiX at the forefront.
Additionally, QuiX was selected by EE Times for their “Silicon 100” list of top startups to watch in 2025, highlighting its prominence among global semiconductor and quantum technology newcomers.
From a commercialization perspective, QuiX’s strategy of offering cloud access to its photonic processors in 2024 opened its hardware to external developers and early adopters. This move mirrors approaches by larger quantum companies and shows that QuiX is building not just hardware but a user-facing software stack and service (albeit in early form) around it. Users can experiment with photonic circuits via the cloud platform, enabling hybrid quantum-classical algorithms and familiarizing industry with photonic qubits. Such efforts help build a user community and gather feedback, strengthening QuiX’s position as a full-stack quantum provider rather than just a component vendor.
Furthermore, QuiX’s collaboration with DLR and other partners has given it a proven delivery record in a complex multi-stakeholder environment. The DLR project involves joint development and application benchmarking, indicating that QuiX is capable of working closely with end-users to adapt its systems to real-world needs.
The company’s multi-office presence (Netherlands and Germany) and growth from a small team to an international footprint with five offices in Europe also speak to its successful expansion. By maintaining strong ties to academic research (many of its team are from UTwente and other research groups) while executing on product deliveries, QuiX has built a strong track record on both fronts.
In summary, the trajectory from a university spin-off in 2019 to a company delivering Europe’s first photonic quantum computers by 2022-2023, publishing breakthroughs in top journals, and securing major funding by 2025 is a testament to QuiX Quantum’s robust track record. They have consistently demonstrated progress on the hardware (bigger, better photonic chips) and progress on commercialization (sales, partnerships, cloud services). This dual success – combined with an explicit roadmap and continued R&D – positions QuiX as a key player pioneering the photonic route to quantum computing.
Challenges
Despite its impressive progress, QuiX Quantum faces a set of significant challenges on the road to building large-scale photonic quantum computers. Many of these challenges are inherent to the photonic approach and will require further innovation and engineering breakthroughs to overcome.
One of the most pressing challenges is scaling the number of entangled photons by orders of magnitude. As Dr. Renema articulated, going from creating one GHZ state on chip to creating a million GHZ states is akin to going from a spark to a blazing fire. Currently, experiments involve on the order of half a dozen photons; a fault-tolerant universal machine might require thousands or more simultaneously or within a short time window. Generating and managing such a large number of indistinguishable photons is extremely demanding. It places heavy demands on photon sources – they must be efficient and multiplexed enough to deliver many photons in parallel – and on the stability of the interferometric network (which must maintain phase coherence across all those optical paths). Any stray loss or dephasing in a large cluster state could spoil the entanglement. Thus, one challenge is purely scaling the entangled state generation. QuiX’s approach of fusion-based cluster state generation will help, but nonetheless the physical resources (number of sources, number of detectors, number of chip modes) will have to expand significantly.
Packaging so many optical components together without loss becomes challenging; for instance, a large number of fiber or waveguide connections have to be made with precision. The Qmode project that QuiX is part of aims to tackle exactly this: developing new methods to connect large-scale photonic chips to the outside world (e.g., efficient fiber arrays, optical PCBs, etc.). Without such packaging innovations, scaling beyond a few tens of modes could introduce prohibitive coupling losses.
Additionally, sources and detectors integration is a challenge: ideally, one would integrate single-photon sources (like quantum dot emitters or nonlinear waveguides) and single-photon detectors directly on the chip to avoid coupling losses altogether. However, integrating sources and detectors on silicon nitride is non-trivial (silicon nitride is passive; sources may need different materials, detectors often need superconductors). This is an active area of research, but not yet solved at scale. For now, QuiX likely uses off-chip sources (such as laser-pumped nonlinear crystals or external quantum dot modules) and off-chip detectors (fiber-coupled SNSPDs), which means more coupling interfaces and thus more potential loss and complexity.
Another major challenge is achieving deterministic, fast feed-forward control in the photonic circuit. As mentioned, the first universal photonic QC in 2026 is intended to demonstrate the ability to do fast feed-forward – that is, measure some photons and, within the same computational cycle, adjust operations on other photons based on the measurement outcomes. This requires a tight integration of photonics with high-speed electronics. The detection of a photon must trigger an electronic signal that can, for example, reconfigure a Mach-Zehnder interferometer on the fly (to implement a conditional phase shift) before the next photons arrive. Achieving this real-time signal processing is challenging because typical thermo-optic phase shifters on photonic chips have response times on the order of microseconds to milliseconds (limited by heating/cooling rates). Faster switching elements exist (such as electro-optic Pockels-effect modulators), but adding them to a complex chip without increasing loss or cross-talk is an engineering puzzle. QuiX may need to incorporate electro-optic switching elements or use fast external optical switches (such as lithium-niobate modulators or acousto-optic deflectors) in tandem with the SiN chip to route photons conditionally.
Additionally, the classical logic that processes detector signals must be low-latency (perhaps implemented in FPGAs or ASICs adjacent to the photonic chip). The challenge is essentially to create a hybrid quantum-classical system where the classical part keeps up with the quantum part’s speed. In superconducting qubits, similar challenges exist for real-time feedback (e.g., feed-forward in surface code error correction), but there the required speeds are slower (MHz scale) and the control electronics sit alongside the same cryostat. In photonics, the raw speed is potentially higher (GHz), making it harder for classical control to keep up unless very specialized hardware is used. Overcoming this will likely require co-design of photonic and electronic circuits – something QuiX is undoubtedly working on as part of its “core building blocks” for fault tolerance. The Series A funding explicitly fuels development of such core blocks, indicating recognition of this challenge.
Photon loss and error correction overhead constitute another intertwined challenge. While SiN waveguides are low-loss, a full photonic quantum computer comprises many components: fiber links, beamsplitters, switches, detectors – each with its efficiency. Losses multiply quickly: for instance, if each of 10 sequential components has 95% transmission, the overall transmission is only about 60%. To implement error correction, one needs multiple photons to encode one logical qubit, which multiplies the number of components and interactions the photons go through. There is a known loss threshold for photonic error correction: if overall transmission per logical qubit drops below that threshold, adding more photons actually reduces the fidelity of the logical qubit rather than improving it. Ensuring the entire system is below the loss threshold is a huge challenge. QuiX’s Hengesbach mentioned that hardware improvements to reduce transmission losses, careful system design, and a reliable supply chain are part of their effort to meet the needed conditions for error correction. This implies working closely with component manufacturers (e.g., ultra-low-loss fiber connectors, high-efficiency detectors, etc.) and optimizing each link in the chain. It’s not just a physics problem but also an engineering and supply chain challenge to get every component at top performance. Even then, the overhead of error correction is expected to be large: one logical qubit might require, say, 50-100 physical photons entangled in a specific code state. So, to do something classically useful, hundreds of logical qubits might mean tens of thousands of physical qubits. Managing that many photons concurrently – generating them, routing them, detecting them – is a formidable challenge that no photonic or otherwise quantum system has yet achieved. QuiX’s modular approach (building smaller units and connecting them) is a strategy to tackle this, but it remains an open challenge how to synchronize and network many modules without loss or decoherence creeping in at the interfaces.
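A back-of-envelope resource count (round-number assumptions, not QuiX’s architecture) illustrates the scale: the photon demand per clock cycle is the product of the logical qubit count, the physical photons per logical qubit, and the multiplexing overhead per delivered photon.

```python
# Back-of-envelope resource count for an error-corrected photonic machine.
# Every number here is an illustrative assumption, not a QuiX figure.
logical_qubits      = 200     # "a few hundred logical qubits"
photons_per_logical = 75      # mid-point of the 50-100 range cited above
mux_overhead        = 20      # extra source attempts per delivered photon

photons_per_cycle = logical_qubits * photons_per_logical
source_attempts   = photons_per_cycle * mux_overhead
print(f"physical photons needed per cycle: {photons_per_cycle:,}")
print(f"source attempts per cycle (with multiplexing): {source_attempts:,}")

# At an assumed 100 MHz clock, that implies a raw photon-generation and
# detection throughput in the trillions of events per second.
clock_hz = 100e6
print(f"photon detections per second: {photons_per_cycle * clock_hz:.1e}")
```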
A further challenge lies in the calibration and control complexity of large photonic circuits. A universal 50-mode interferometer contains on the order of a thousand MZI cells (N(N−1)/2 ≈ 1,225) and roughly twice that many phase shifters, all of which need precise tuning to achieve a desired operation. As the number of modes grows, the calibration of these large meshes can become time-consuming and sensitive to drifts (thermal drifts, etc.). While integrated photonics is quite stable, long computations or long-term operation might require active stabilization of phases. Techniques like mesh decomposition and self-configuration algorithms (e.g. the Reck and Clements schemes referenced in the 12-mode paper) help set phases initially, but maintaining them is an ongoing task. This is especially true if the device is to be run in a data center environment where temperature fluctuations or vibrations might be non-negligible. QuiX will need robust feedback control for its interferometers to ensure they remain aligned for potentially hours or days of computation. This is analogous to keeping laser arrays phase-locked in optics – doable, but an added layer of control complexity.
From a broader perspective, one challenge is competition and rapid progress in other modalities. While not a technical hurdle per se, it is a strategic challenge: superconducting qubit systems (by companies like IBM, Google) have already demonstrated >100 physical qubits and are implementing error mitigation and small error-correcting codes, while trapped ion systems have extremely high fidelity gates (99.9% range) albeit at smaller numbers. Photonic quantum computing, especially the universal gate-model kind, is still in earlier stages (with no universal photonic processor beyond a few qubits demonstrated yet, as of 2025). QuiX is essentially racing to show that photonics can catch up and then surpass in scalability. This means the company must not only solve the technical issues but do so quickly to remain relevant. The timeline to 2026 for a universal machine is aggressive, and any delays due to technical challenges could risk leaving photonics behind in the quantum race. However, QuiX’s advantage is that if photonics does work as hoped, scaling to large numbers could accelerate beyond what other platforms can manage, due to the manufacturability and networking strengths discussed. Still, managing expectations and hitting interim milestones (like the 8-qubit universal demo) on time is a challenge in itself for the team.
Lastly, we consider software and algorithm integration challenges. While the question focuses on hardware, it’s worth noting that programming a photonic quantum computer (especially a measurement-based one) requires a different approach than gate-model systems. QuiX and others will need to develop a software stack that can translate high-level algorithms into sequences of photonic source triggers and measurement patterns on a cluster state. Ensuring a full-stack solution (from user interface down to hardware pulse control) is polished will be important for user adoption. This is a softer challenge compared to the physics issues, but it will grow in importance as the hardware matures.
In conclusion, QuiX Quantum faces a convergence of challenges: from scaling up photon numbers and entanglement, to integrating fast electronics, minimizing losses, and handling the complexity of large photonic networks. The company is actively addressing these – through projects like QuScale and Qmode for interconnects, through focused R&D on sources and feed-forward, and through careful roadmap planning for error correction. Each challenge is substantial, and overcoming all of them will determine whether photonic quantum computing can reach its promised fault-tolerant scale.