Fujitsu

(This profile is one entry in my 2025 series on quantum hardware roadmaps and CRQC risk. For the cross‑vendor overview, filters, and links to all companies, see Quantum Hardware Companies and Roadmaps Comparison 2025.)
Introduction
Fujitsu, a Japanese IT and computing giant, has emerged as a serious player in quantum computing through a multi-pronged strategy spanning cutting-edge quantum hardware and quantum-inspired annealing solutions. The company is pursuing one of the most ambitious quantum roadmaps to date – developing a 10,000+ qubit superconducting quantum computer by 2030 – with a focus on achieving practical advantage using fault-tolerant logical qubits.
Fujitsu’s approach uniquely combines its heritage in high-performance computing (it co-developed the Fugaku supercomputer) with new quantum technologies, all under a “made-in-Japan” initiative. By leveraging both digital annealers (classical CMOS hardware that mimics quantum annealing) and superconducting quantum processors, Fujitsu aims to bridge current computational needs and the future era of true quantum advantage. The result is a broad, hardware-centered program positioning Fujitsu alongside global quantum leaders, but with its own strengths in hybrid HPC integration and a clear goal of building a fault-tolerant quantum computer in the coming decade.
Milestones & Roadmap
Fujitsu’s quantum journey can be traced through a series of notable milestones, reflecting steady progress from quantum-inspired systems to advanced quantum hardware:
2018 – Digital Annealer Launch: Anticipating that practical quantum computers would take time to mature, Fujitsu introduced the Digital Annealer (DA), a quantum-inspired CMOS chip architecture. The first-of-its-kind DA hardware can solve complex optimization problems (formulated as fully connected QUBO models, i.e. quadratic unconstrained binary optimization problems) at speeds rivaling early quantum annealers, but operates at room temperature on conventional silicon. Fujitsu’s 1st-generation DA (2018) featured a 1,024-bit architecture, and it has since evolved through four generations (the 4th-gen launched in May 2022) with increasing problem size (up to 100,000 variables) and speed improvements. This technology saw early commercial adoption – by 2022 Fujitsu had 186 cloud subscriptions worldwide for its annealer services, with deployments in manufacturing, finance, logistics and more. Clients like KDDI used it to optimize telecom networks, and chemical firms used it to accelerate materials discovery. These successes validated Fujitsu’s quantum-inspired approach while true quantum hardware was still nascent.
2020-2021 – RIKEN Partnership and 1000-Qubit Goal: In late 2020, Fujitsu announced a landmark partnership with RIKEN (Japan’s premier research institute) to jointly develop superconducting quantum computers. This led to the April 2021 opening of the RIKEN RQC-Fujitsu Collaboration Center, tasked with developing the hardware and software foundations for a 1,000-qubit superconducting quantum computer. From the outset, the focus was on scalability and error correction – the team aimed to improve qubit fabrication, cryogenic packaging, and error mitigation, ultimately building a working prototype to test applications. Fujitsu’s experience with the Digital Annealer (e.g. in formulating customer problems) was leveraged to guide these developments. The collaboration center’s initial mandate ran through 2025, and it has since been extended to 2029 to continue advancing toward large-scale, fault-tolerant machines.
2023 – First Domestic Quantum Computer (64 Qubits): By October 2023, Fujitsu and RIKEN had successfully developed and deployed a 64-qubit superconducting quantum computer – notably, this was Japan’s second domestically built quantum computer. (The first was a smaller prototype unveiled by RIKEN in March 2023.) The 64-qubit system at the RIKEN-Fujitsu center demonstrated Fujitsu’s ability to design and operate a medium-scale superconducting processor. It was made available via a cloud-based hybrid quantum computing platform that integrates the quantum processor with a powerful simulator, allowing selected enterprises and research institutions to experiment with quantum algorithms from late 2023 onward. This hybrid platform approach – combining a real 64-qubit device with a Fugaku-derived simulator – was aimed at fostering early application development in fields like materials science, finance, and drug discovery while the hardware scales up.
April 2024 – First Commercial Quantum System Sale: In a significant sign of confidence, Fujitsu received Japan’s first order for a domestic quantum computer in 2024. The National Institute of Advanced Industrial Science and Technology (AIST) placed an order for a Fujitsu-made gate-based superconducting quantum system, to be delivered and operational in early 2025. This marked the first time a Japanese vendor sold a quantum computing system. The machine, to be hosted at AIST’s new Quantum/AI center (G-QuAT), is designed to support “hundreds of qubits” and will be integrated with AIST’s classical AI-Bridging Cloud Infrastructure (ABCI). The system leverages technologies from the RIKEN collaboration (including a novel high-density wiring scheme that allows scaling qubit count without a larger dilution refrigerator). Fujitsu and AIST envision this as a platform for research on quantum-AI hybrid computing and a testbed for industry use cases, underlining Fujitsu’s move from lab R&D into practical deployment.
April 2025 – 256-Qubit World-Leading Processor: Fujitsu and RIKEN announced the development of a 256-qubit superconducting quantum computer, one of the largest devices of its kind at the time. This 256-qubit system builds directly on the 64-qubit design: Fujitsu employed a modular “unit cell” architecture (4 qubits per cell) with a unique 3D chip layout, allowing them to quadruple the qubit count without redesigning the qubit itself. Clever thermal engineering and high-density packaging ensured the 256 qubits fit in the same cryostat used for the 64-qubit machine, overcoming cooling and wiring challenges. This accomplishment demonstrated the scalability of Fujitsu’s design and was hailed as a new step toward practical superconducting quantum computers. The 256-qubit processor enables experiments with more complex algorithms and quantum error correction techniques (using many qubits for encoding) that were not possible on the 64-qubit device. The company also shared a roadmap to launch a 1,000-qubit system by 2026, applying the same modular scaling approach, and ultimately to keep scaling along the same path toward 10,000+ qubits by 2030.
August 2025 – 10,000+ Qubit Early-FTQC Project: Fujitsu formally announced it had begun development of a 10,000+ qubit superconducting quantum computer, targeting completion in fiscal 2030. Unlike many headline-grabbing qubit milestones, Fujitsu’s emphasis here was on logical qubits and fault tolerance. The planned machine is expected to deliver ~250 logical qubits (error-corrected qubit units useful for algorithms) out of those physical qubits. To achieve this, it employs Fujitsu’s novel STAR architecture (details in the next section) for early fault-tolerant computing. This project, backed by Japan’s NEDO funding agency, is not just about hitting a qubit count but about creating a practical, resilient quantum system. It involves parallel R&D in qubit fabrication, chip interconnects, cryogenic control electronics, and error decoding algorithms. Fujitsu’s CTO declared this initiative aims to produce a “made-in-Japan fault-tolerant superconducting quantum computer,” integrating the best of superconducting qubits with complementary technologies like diamond-spin qubits by the mid-2030s. The roadmap beyond 2030 indeed envisions hybrid qubit systems (superconducting + diamond) and scaling to 1,000 logical qubits by 2035.
Throughout this timeline, Fujitsu’s strategy shows a clear pattern: start delivering value with interim technologies (like digital annealers and small quantum prototypes), while executing on a long-term roadmap toward large-scale, fault-tolerant quantum computing. Next, we examine how Fujitsu is addressing the central challenge of quantum computing – achieving fault tolerance – which underpins its roadmap.
Focus on Fault Tolerance
Fault tolerance is at the heart of Fujitsu’s quantum R&D, recognizing that only error-corrected quantum computers can reliably tackle real-world problems. Fujitsu’s approach to fault tolerance is embodied in its proprietary STAR architecture, which stands for Space-Time Analog Rotation – an early-FTQC (early Fault-Tolerant Quantum Computing) design developed with researchers at Osaka University. The STAR architecture introduces innovative methods to reduce the overhead of quantum error correction: notably, it can perform arbitrary phase-rotation operations in an analog manner that dramatically cuts down the number of physical qubits needed for a given algorithm. Traditional quantum error correction (e.g. surface codes) might require hundreds or thousands of physical qubits to encode 1 logical qubit, but Fujitsu has reported that STAR could achieve similar computational accuracy with only ~10% of the qubits that conventional schemes would need. In concrete terms, an algorithm that might normally demand on the order of one million physical qubits for full error-corrected execution could be run with tens of thousands of qubits under the STAR methodology. This is a game-changer because it brings the required scale of hardware into the near future – Fujitsu demonstrated theoretically that a useful quantum advantage (e.g. a materials science calculation) could be achieved with ~60,000 physical qubits and clever error mitigation, instead of the million-plus qubits often cited in the literature.
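As a rough back-of-envelope illustration of what that ~10% figure implies (the round numbers below are taken from the estimates quoted above and rounded for arithmetic; they are not additional Fujitsu data):

```latex
\[
N_{\mathrm{STAR}} \;\approx\; 0.1 \times N_{\mathrm{conventional}}
\qquad\Longrightarrow\qquad
N_{\mathrm{conventional}} \sim 10^{6}\ \text{physical qubits}
\;\;\longrightarrow\;\;
N_{\mathrm{STAR}} \sim 10^{5}\ \text{physical qubits.}
\]
```

Additional error mitigation is what reportedly brings Fujitsu’s quoted target further down, to roughly 60,000 physical qubits for a useful workload.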
To achieve these savings, STAR incorporates two key innovations. First, it improves the precision of qubit rotations – Fujitsu and Osaka University developed techniques to perform phase rotation gates with extremely high accuracy, minimizing the incremental errors that accumulate in long algorithms. Second, STAR uses an automated quantum circuit generator that optimizes the sequence of physical operations for a given logical computation. This software automatically finds efficient gate implementations and can dynamically adjust operations to correct for errors on the fly, reducing the error impact overall. These techniques, announced in 2023-2024, were validated through simulations showing that a quantum computer could solve a problem in 10 hours that would take a classical computer 5 years – using 60k qubits and early-stage error correction. In other words, quantum advantage in the early-FTQC era (~2030) is attainable, provided error rates are managed via such architecture improvements.
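For scale, the quoted 10-hour versus 5-year comparison corresponds to a speedup factor of a few thousand; this is simple arithmetic on the figures above, not an additional Fujitsu claim:

```latex
\[
\frac{5\ \text{years}}{10\ \text{hours}}
\;\approx\;
\frac{5 \times 365 \times 24\ \text{h}}{10\ \text{h}}
\;\approx\; 4.4 \times 10^{3}.
\]
```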
This result sketches one of the clearest near-term paths to practical quantum computing, since it bypasses the need for a fully error-corrected (FTQC) machine and instead settles for a partially fault-tolerant system with acceptable error rates. Fujitsu’s plan to realize 250 logical qubits by 2030 is rooted in these advances. Each logical qubit will still consist of many physical qubits with error correction, but far fewer than earlier approaches required. By 2035, Fujitsu aims for 1,000 logical qubits, indicating a further maturation of their error correction (possibly moving from “early-FTQC” to more fully fault-tolerant regimes).
Beyond architecture, Fujitsu is tackling fault tolerance through collaborations and the full stack. They’ve engaged quantum error-correction experts – for example, partnering with Quantum Benchmark/Keysight for error mitigation techniques and with academic leaders like Prof. Keisuke Fujii (Osaka) on novel error correction theories. Error mitigation experiments are already being conducted on Fujitsu’s prototype hardware: the RIKEN collaboration ran quantum chemistry algorithms combined with error mitigation on the 64-qubit machine to test accuracy improvements. Fujitsu also participated in IEEE Quantum Week 2024, sharing progress on error correction and demonstrating small-scale error detection on their qubits. On the materials side, they are improving qubit quality – refining fabrication to reduce variability and losses – since higher native fidelity qubits mean fewer errors to correct. Indeed, the NEDO-backed 10,000-qubit project explicitly targets advancements in qubit manufacturing precision, cryogenic control electronics, and error decoding algorithms, all of which underpin a fault-tolerant system. In summary, Fujitsu’s heavy focus on fault tolerance – from the STAR architecture to partnerships in error mitigation – shows a commitment to useful quantum computing, not just qubit counts. This focus will directly influence how soon quantum computers become powerful enough to threaten classical cryptography, which we discuss next.
CRQC Implications
The advent of a Cryptographically Relevant Quantum Computer (CRQC) – one capable of breaking present-day encryption like RSA – is a concern for governments and industry. Fujitsu’s work provides an informed perspective on this timeline. In early 2023, Fujitsu researchers used their 39-qubit quantum simulator (leveraging technology from the Fugaku supercomputer) to assess the resources required for Shor’s algorithm (the quantum factorization algorithm) to crack RSA encryption. The findings reinforced that a large-scale, fault-tolerant quantum computer is needed for such a task: approximately 10,000 physical qubits with 2.23 trillion quantum gates, running continuously for about 104 days, would be required to factor a standard 2,048-bit RSA key. This is orders of magnitude beyond the capability of today’s devices, which have at most a few hundred noisy qubits and can execute only thousands of gates before errors dominate. In other words, Fujitsu’s study concluded that the quantum decryption threat to RSA remains distant – likely years away – under current technological trends. Even specialized quantum approaches like annealing are not close to this: there is ongoing debate about whether near-term NISQ devices or quantum annealers could attack encryption, but Fujitsu’s stance is that without error-corrected qubits, such feats aren’t feasible.
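A quick sanity check of what those figures imply (my arithmetic on the numbers quoted above, not part of the original study): sustaining 2.23 trillion gates over 104 days requires a gate throughput on the order of a few hundred thousand operations per second, maintained for the entire run:

```latex
\[
\frac{2.23\times10^{12}\ \text{gates}}{104\ \text{days} \times 86{,}400\ \text{s/day}}
\;\approx\;
\frac{2.23\times10^{12}}{9.0\times10^{6}\ \text{s}}
\;\approx\; 2.5\times10^{5}\ \text{gates per second.}
\]
```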
This research highlights two implications. First, it underscores why Fujitsu is prioritizing fault tolerance. Only by drastically improving qubit counts and error rates (as in their 2030/2035 roadmap) can one even approach CRQC-level capabilities. Their focus on logical qubits is directly aligned with the requirements for cryptography-breaking algorithms, which demand long, precise quantum circuits. Second, Fujitsu’s results support global calls to transition to post-quantum cryptography (PQC) well before a CRQC materializes. The company’s findings were publicized as a reminder that while a CRQC is not imminent, the prudent course is to prepare in advance (since data can be stolen now and decrypted later when a quantum computer becomes available). Japan, like other nations, is moving to adopt PQC standards, and Fujitsu as a leading IT provider is likely involved in supporting this migration (for instance, by securing its cloud services and customer solutions against future quantum threats). Internally, Fujitsu’s dual role is clear: build the quantum computers that push the envelope, but also help protect data from the very machines they’re building, until new cryptographic safeguards are in place. As of 2025, with Fujitsu’s best quantum machines in the few-hundred-qubit range, breaking RSA is far off, giving a window of opportunity to implement PQC. Fujitsu’s measured view – that RSA-cracking will require fault tolerance and trillions of operations – is a reminder that no CRQC can emerge until large-scale error correction succeeds. This aligns with global expert consensus and adds impetus to Fujitsu’s quest for fault-tolerant architectures.
Modality & Strengths/Trade-offs
Fujitsu’s quantum portfolio spans multiple computing modalities, each with its own strengths and trade-offs. However, the company’s main bet is on gate-based superconducting qubits, complemented by quantum-inspired annealing and exploratory research into alternative qubit types. Here we break down the modalities Fujitsu is investing in:
Superconducting Gate Quantum Computers
This is Fujitsu’s primary focus for realizing universal quantum computing. Superconducting qubits (implemented as tiny Josephson junction circuits on chips) are attractive for their fast gate speeds and relative maturity – they’re used by leaders like IBM and Google as well. Fujitsu’s superconducting qubits operate in milli-Kelvin temperatures inside dilution refrigerators.
A key strength of Fujitsu’s approach is its scalable architecture: by using a 3D integrated “unit cell” design, they can tile qubit modules to increase qubit count without fundamentally changing the chip design. This modularity, proven in scaling from 64 to 256 qubits, suggests the design can grow to thousands of qubits (with improvements in packaging and interconnect).
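In round numbers, the unit-cell math is straightforward (the cell counts here are inferred from the 4-qubits-per-cell figure quoted earlier, not from published floor plans):

```latex
\[
N_{\text{qubits}} = 4 \times N_{\text{cells}}:
\qquad 4 \times 16 = 64,
\qquad 4 \times 64 = 256,
\qquad 4 \times 250 = 1{,}000.
\]
```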
Another strength is Fujitsu’s emphasis on high-density wiring and advanced cooling design, which allowed the team to quadruple the qubit count within the same fridge volume.
The trade-offs for superconducting qubits are well-known: they require ultra-low temperatures and elaborate control systems, and qubit coherence times are limited, necessitating error correction for long computations. Fujitsu is addressing these by developing better materials (to reduce qubit noise) and optimizing cryogenic electronics. The STAR early-FTQC architecture further mitigates the trade-off of high qubit overhead for error correction by significantly reducing the number of physical qubits per logical qubit.
In essence, Fujitsu is doubling down on superconducting qubits, betting that its engineering innovations and error-correction schemes will make this modality viable for large-scale, general-purpose quantum computers in the 2030s.
Digital Annealer (Quantum-Inspired CMOS)
In parallel, Fujitsu’s Digital Annealer represents a non-quantum modality that is “quantum-adjacent.” It is a classical piece of hardware, built with conventional silicon CMOS, but it’s designed to emulate quantum annealing processes for optimization problems. The Digital Annealer can be viewed as a specialized co-processor: it contains thousands of “bit” units and interconnections that mimic the energy minimization of an Ising-model or QUBO problem, effectively performing a parallel stochastic search for optimal solutions. The strength of this approach is that it achieves quantum-like speedups on certain combinatorial problems today, without needing any qubits or cryogenics. It runs at room temperature, in standard data centers, and can scale to very large problem sizes (100k variables) beyond what current quantum annealers (like D-Wave’s) can handle due to noise or connectivity limits. Fujitsu has leveraged this strength by deploying Digital Annealers in industries ranging from portfolio optimization to factory scheduling, delivering practical value while true quantum computers are still limited.
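To make the QUBO picture concrete, the sketch below shows the energy function such a machine minimizes and a toy single-flip annealing search over it. This is a minimal software illustration only: the actual Digital Annealer evaluates many candidate bit flips in parallel on dedicated CMOS hardware, and none of the names below refer to Fujitsu APIs.

```python
import math
import random

def qubo_energy(Q, x):
    """Energy of a bit vector x under a QUBO matrix Q: E(x) = sum_ij Q[i][j] * x[i] * x[j]."""
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def toy_anneal(Q, n_bits, sweeps=2000, t_start=5.0, t_end=0.01, seed=0):
    """Single-bit-flip simulated annealing over a fully connected QUBO (illustrative only)."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n_bits)]
    energy = qubo_energy(Q, x)
    best_x, best_e = x[:], energy
    for step in range(sweeps):
        # Geometric cooling schedule from t_start down to t_end.
        t = t_start * (t_end / t_start) ** (step / max(sweeps - 1, 1))
        i = rng.randrange(n_bits)
        x[i] ^= 1                      # propose flipping one bit
        new_e = qubo_energy(Q, x)
        if new_e <= energy or rng.random() < math.exp((energy - new_e) / t):
            energy = new_e             # accept the move (always downhill, sometimes uphill)
            if energy < best_e:
                best_x, best_e = x[:], energy
        else:
            x[i] ^= 1                  # reject: undo the flip
    return best_x, best_e

# Tiny example: E(x) = -x0 - x1 + 2*x0*x1 is minimized by setting exactly one bit (E = -1).
Q = [[-1.0, 2.0],
     [0.0, -1.0]]
print(toy_anneal(Q, n_bits=2))
```

Scaling this basic idea to fully connected problems with 100,000 variables is exactly where the dedicated hardware parallelism of the Digital Annealer is meant to pay off.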
The trade-off, of course, is that no quantum effects (like superposition or tunneling) are actually present – the Digital Annealer is ultimately executing a cleverly crafted classical algorithm in hardware. This means it cannot solve all problem types that a universal quantum computer could, and some problems might eventually be solved faster by real quantum annealers once those mature (quantum parallelism could theoretically outpace any classical device on certain problems). Fujitsu acknowledges that quantum annealers “will surpass the Digital Annealer in the long run because of their super-massive quantum parallelism.” Nonetheless, as a bridge technology, the Digital Annealer has been extremely valuable. It allowed Fujitsu to develop expertise in formulating optimization problems and quantum-inspired algorithms, cultivate a software ecosystem, and engage customers early. This experience feeds into their quantum efforts – many use cases tackled on the Digital Annealer today (e.g. supply chain optimization, drug molecule similarity search) are exactly those that future gate-model or annealing quantum computers aim to accelerate further.
Other Modalities
Fujitsu is also exploring other qubit modalities in research collaborations. One notable area is diamond spin qubits: Fujitsu has partnered with quantum labs (like Osaka University and TU Delft) to investigate qubits based on nitrogen-vacancy centers in diamond. Diamond spin qubits can have extremely long coherence times, even at higher temperatures, making them attractive for quantum memories or networked qubits. Fujitsu’s long-term plan includes integrating diamond spin qubits with superconducting qubits to harness the advantages of both. For example, superconducting qubits could provide fast processing, while diamond spins could serve as robust storage of quantum information or as interconnect qubits between chips. The trade-off here is that diamond qubits typically have slower gate operations and are harder to scale in number, so they may not replace superconductors but rather complement them.
Fujitsu is also keeping an eye on photonic quantum computing and other emerging modalities (“other methods” are being explored in early research), but there haven’t been major public projects in those areas from Fujitsu yet. Overall, Fujitsu’s modality strategy is heterogeneous but with a clear prioritization: advance superconducting qubits to large scale, use digital annealers to capture value now, and investigate secondary qubit tech (like spins) to future-proof the architecture. This diversified approach plays to Fujitsu’s strengths in hardware and systems integration, while hedging against the risk that any single quantum tech could hit a roadblock.
In summary, Fujitsu’s main strength lies in its holistic perspective – it is one of the few companies covering the entire spectrum from classical HPC and quantum-inspired chips to cutting-edge quantum processors. The trade-off of such breadth is the need to maintain expertise across domains, but Fujitsu has managed this by leveraging partnerships (RIKEN for superconducting physics, university labs for new qubit science, etc.) and its own rich R&D infrastructure. As we’ll see next, this comprehensive approach has translated into a solid track record of achievements and positions Fujitsu well in the quantum race.
Track Record
Fujitsu’s track record in quantum computing is characterized by steady, credible progress and industry engagement, rather than flashy one-off demonstrations. By building on decades of computing experience, Fujitsu has accumulated several notable achievements:
Quantum-Inspired Industry Impact: Since launching the Digital Annealer in 2018, Fujitsu has delivered real-world optimizations for clients. More than 30 press-announced use cases were completed by 2022, including collaborations where the DA solved problems that were previously intractable. For example, Fujitsu worked with KDDI to optimize radio tower parameters for better mobile signal quality, and with materials company Showa Denko to dramatically speed up the search for new semiconductor material configurations. The Digital Annealer business gained global traction (users in Japan, North America, Europe) and proved the concept of quantum-inspired computing as a value-add. Fujitsu’s ability to integrate the DA into cloud services (accessible via APIs and software SDKs co-developed with 1QBit) showcased its strength in delivering complex tech in a user-friendly way. This early success not only generated revenue but built a community of quantum-aware customers who are likely to transition to full quantum solutions when ready. It positioned Fujitsu as a pioneer in quantum-inspired optimization, frequently cited alongside D-Wave in that domain.
Supercomputing and Simulation: Fujitsu’s pedigree in HPC (High-Performance Computing) is a major asset to its quantum track record. The company co-developed the Fugaku supercomputer (which was ranked #1 globally in 2020) and has deep expertise in large-scale computing systems. Fujitsu leveraged this by creating a 39-qubit quantum circuit simulator using Fugaku’s technology. This simulator achieved world-leading performance in simulating quantum algorithms and was crucial in research like the RSA study. In 2024, Fujitsu also built “ABCI-Q”, a quantum-AI cloud platform combining classical AI infrastructure (ABCI) with access to quantum resources. These endeavors demonstrate Fujitsu’s practical understanding of hybrid computing: they are not treating quantum in isolation but as part of an ecosystem with classical computing. The company even received an award (with research partners) in March 2024 for developing a high-performance hybrid computing platform that integrated Japan’s superconducting quantum computer with classical HPC. This recognition underscores Fujitsu’s leadership in merging quantum and classical workflows, which is critical for near-term quantum applications.
Successful Hardware Prototypes: Unlike many competitors that are still at the theory or small-demo stage, Fujitsu has designed, built, and operated multiple working quantum processors. The 64-qubit and 256-qubit superconducting chips developed with RIKEN stand as concrete proof of Fujitsu’s hardware engineering capabilities. Moreover, these weren’t lab-only devices – the 64-qubit machine was opened to external users (Japanese companies and research groups) in a testbed capacity starting 2023. This early user access allowed Fujitsu to gather feedback on real quantum algorithms and to begin co-developing quantum solutions with industry (e.g., in finance and materials science). The 256-qubit system further cements Fujitsu’s record, as at the time of its announcement it was one of the largest gate-model quantum computers available for external use anywhere. The ability to scale 4× in roughly a year and a half (from the 64-qubit system in October 2023 to the 256-qubit system in April 2025) demonstrated excellence in project execution. Additionally, Fujitsu’s hardware work has been validated by external orders – the sale of a system to AIST (for 2025 delivery) shows that their quantum tech is commercial-grade, a huge milestone for any provider. It made Fujitsu the first Japanese company to commercially supply a quantum computer, highlighting trust in their engineering from Japan’s scientific community.
Innovation and IP Generation: Fujitsu has been actively filing patents and publishing research across the quantum stack. The STAR architecture and analog rotation methods were documented in joint papers with academia. Fujitsu’s researchers (e.g., Dr. Shintaro Sato who heads Fujitsu’s Quantum Lab) are often keynote speakers at quantum technology conferences, indicating peer recognition. The company also scored a notable theoretical result by demonstrating, in simulation, one of the first credible paths to quantum advantage on a useful problem (the 10-hour versus 5-year analog rotation result). This kind of breakthrough bolsters Fujitsu’s reputation as a thought leader in quantum algorithms, not just hardware. Fujitsu has also made strides on the software side by developing a full stack: from low-level control systems (they use a Keysight quantum control system for their 256-qubit chip) to middleware and cloud interfaces. Having control over the stack means Fujitsu can fine-tune performance and integrate improvements quickly – a track record advantage that vertically integrated efforts enjoy.
Ecosystem Building: Fujitsu’s track record isn’t just about technology, but also about building a quantum ecosystem in Japan. By collaborating with RIKEN, AIST, universities (Osaka, Tokyo, Delft), and startups (QunaSys for quantum chemistry software, 1QBit for annealer software), Fujitsu plays a central role in a network of innovation. The extension of the RIKEN collaboration center to 2029 shows the partnership’s success and the confidence to continue long-term. Fujitsu is also contributing to workforce development: it has a dedicated quantum research division and has hired young talent (such as 30+ new graduates in India for quantum and AI R&D as noted in late 2024). By engaging in public outreach (e.g., posting “quantum computing strategy” on Fujitsu’s website and running the Fujitsu PR note blog to share behind-the-scenes stories), they are demystifying quantum tech for a broader audience. All these efforts enhance Fujitsu’s track record as a leader who not only builds hardware but also cultivates the environment needed for quantum computing to thrive.
Taken together, Fujitsu’s achievements show a company that delivers on its promises step by step. They haven’t claimed quantum supremacy or made sensational announcements; instead, they consistently hit technical milestones (qubit counts, new architectures), apply their technologies in practice, and win the trust of research and industry partners. This solid foundation will be crucial as Fujitsu faces the many challenges that lie ahead on the road to a large-scale quantum computer.
Challenges
Despite its successes, Fujitsu – like everyone in the quantum race – faces significant challenges and uncertainties in the quest to realize practical quantum computing. Some of the key challenges include:
Scaling Physical Qubits: While Fujitsu’s 256-qubit device is impressive, moving to thousands of qubits will test the limits of fabrication and engineering. Superconducting qubits require pristine materials and uniformity; even tiny variations can introduce noise. Fujitsu must achieve semiconductor-like manufacturing precision for quantum chips, which is non-trivial. Increasing qubit count also means more control wiring and microwave lines into the cryostat. Fujitsu managed a 4× wiring density increase from 64 to 256 qubits by meticulous design, but going to 1,000+ qubits may demand novel solutions like chip-level multiplexing or photonic interconnects to avoid a “wiring bottleneck.” The company is actively researching 3D packaging and chip-to-chip communication to tackle this, yet it remains a big challenge. The 1,000-qubit goal for 2026 will likely require multi-chip modules or larger fridges; ensuring qubits on different chips stay entangled/coherent is an open technical hurdle.
Maintaining Coherence and Reducing Error Rates: As systems grow, keeping qubits stable (coherent) throughout long computations becomes harder. Fujitsu’s approach of early fault tolerance (STAR) helps mitigate this by correcting errors on the fly, but it still needs high fidelity qubits and gates to start with. Achieving the error rates needed for effective error correction (e.g. gate error <0.1% for surface codes) is extremely challenging. It demands improvements in materials (to reduce loss and noise in superconducting circuits) and in shielding against external noise. Fujitsu will need to iterate on qubit design – possibly moving to larger qubits or different designs to improve coherence. Even with STAR, their estimate for a useful early-FTQC was ~60,000 qubits, implying error rates that still necessitate tens of thousands of qubits for one problem. Pushing those error rates down by an order of magnitude is a continual battle. This challenge is why Fujitsu is also exploring diamond spin qubits (which have longer native coherence) for future integration. However, merging two very different qubit technologies (superconducting and spin) by 2035 is itself a formidable task, both in terms of hardware interface and software control.
Talent and R&D Bandwidth: Building a fault-tolerant quantum computer is a massive interdisciplinary effort. Fujitsu must attract and retain top talent: quantum physicists, microwave engineers, cryogenic experts, software developers, and more. Competition for quantum experts is global and fierce – companies like Google, IBM, Intel, as well as many startups, are vying for the same pool of PhDs and engineers. Fujitsu’s collaborations with academia help tap into new graduates, but it will need to keep expanding its team as projects scale up. There is also the challenge of managing R&D across different modalities (quantum, classical, spin, etc.). Ensuring the Digital Annealer team, the superconducting qubit team, and others cross-pollinate their insights without being siloed will be important for Fujitsu’s success. So far, the company’s matrix of partnerships (RIKEN, universities, etc.) has helped share the load, but coordinating everyone toward the singular goal of a large FTQC by 2030 will become more complex as the effort grows.
Global Competition and Market Uncertainty: Fujitsu’s timeline puts it in direct competition with IBM (which unveiled a 1,000+ qubit processor in 2023 and has roadmapped multi-thousand-qubit modular systems for the mid-2020s) and Google (which is researching its own large-scale error-corrected prototypes). By focusing on logical qubits and fault tolerance, Fujitsu has differentiated its strategy – aiming for practical usefulness rather than just raw qubit numbers. However, if competitors achieve breakthroughs (e.g. a new type of qubit or error-correction method), Fujitsu could be leapfrogged. There is also competition from other Japanese entities: e.g., IBM Japan’s Quantum System One installed in Kawasaki, and research by universities (the University of Tokyo is working with IBM, NTT is exploring photonic quantum computing, etc.). Fujitsu needs to maintain a leadership role in Japan’s quantum initiative to secure continued government support and industry buy-in. On the market side, the demand for quantum solutions is still emerging. Fujitsu has cultivated early adopters via its annealer and hybrid platform, but widespread adoption will depend on quantum computers actually outperforming classical methods on valuable problems. The timeline for this is uncertain and could slip if technical progress stalls. Fujitsu and others also have to manage expectations – overhyping can lead to a “quantum winter” of disillusionment. So far, Fujitsu has been fairly realistic (emphasizing that fault-tolerant quantum computing is a long game), but it must continue to deliver progress to keep stakeholders convinced of the ROI.
Bridging Today’s and Tomorrow’s Tech: Fujitsu’s unique challenge is transitioning its Digital Annealer clients to quantum computers eventually. The DA and quantum processors are very different beasts – one is essentially a specialized CMOS accelerator, the other a fragile quantum machine. Ensuring that expertise and software developed for the annealer can transfer to gate-model quantum computing is not straightforward. Fujitsu will have to provide robust software tools to map optimization problems from the Digital Annealer format to their quantum gate-model platform when the time comes. They are already thinking ahead, as evidenced by the hybrid algorithms they intend to run (classical and quantum working together). Another bridging challenge is integrating Fujitsu’s quantum hardware with classical HPC workflows in a seamless cloud service. This involves a lot of software engineering (queueing jobs, error handling, etc.) which is often under-appreciated but crucial for user adoption. Fujitsu’s experience in cloud services and its existing quantum cloud platform give it a leg up here, but scaling that platform from a handful of users to potentially many customers by 2030 will require reliability and user support at a level typical of enterprise IT services – a new paradigm for quantum computing offerings.
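As one concrete illustration of what such a mapping layer involves, the standard change of variables below turns an annealer-style QUBO into the Ising form consumed by gate-model heuristics such as QAOA (this is textbook material, not a description of Fujitsu’s specific tooling):

```latex
\[
x_i \;=\; \frac{1 - z_i}{2},
\qquad x_i \in \{0,1\},\quad z_i \in \{-1,+1\},
\]
\[
\sum_{i \le j} Q_{ij}\, x_i x_j
\;=\;
\sum_{i<j} J_{ij}\, z_i z_j \;+\; \sum_i h_i\, z_i \;+\; \mathrm{const},
\]
```

where the couplings J_ij, fields h_i, and constant offset are fixed linear combinations of the Q_ij entries. The resulting Ising Hamiltonian is what a gate-model algorithm would then try to approximate the ground state of, which is why problem formulations built for the Digital Annealer can, in principle, carry over.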
In summary, Fujitsu must navigate technical, human, and competitive challenges on its path forward. Scaling up to a fault-tolerant quantum computer is not guaranteed – unforeseen physics issues or engineering roadblocks could emerge as qubit counts increase. Nonetheless, Fujitsu has shown a proactive stance toward these challenges: it is investing in next-generation technologies (like new qubits and interconnects) to mitigate scaling issues, collaborating broadly to access talent and knowledge, and staying focused on the end goal of useful, error-corrected quantum computation. The next few years – as they attempt to go from 256 qubits to 1000+ and demonstrate early fault-tolerant operations – will be critical. If Fujitsu can overcome these hurdles, it will solidify its position as a global quantum computing leader.