Microsoft
(This profile is one entry in my 2025 series on quantum hardware roadmaps and CRQC risk. For the cross‑vendor overview, filters, and links to all companies, see Quantum Hardware Companies and Roadmaps Comparison 2025.)
Introduction
Microsoft’s quantum program is defined by a long-term bet on topological qubits – Majorana-based devices designed to suppress errors at the hardware level – paired with a near-term, market-facing push through Azure Quantum. Publicly, Microsoft frames its roadmap in three phases: Foundational (prove and control Majorana modes), Resilient (demonstrate error-corrected logical qubits on a small system), and Scale (manufacture at CMOS-like volumes to reach millions of qubits). After years of materials work and device redesign, Microsoft reported the key Foundational steps (Majorana signatures and controllable pairs) and, by 2025, unveiled a “Majorana 1” prototype chip intended to exercise braiding-style operations.
In parallel, Azure Quantum makes third-party hardware (IonQ, Quantinuum, Rigetti, others) available today, builds the software stack (Q#, resource estimators, error-correction tooling), and forges collaborations (e.g., multi-logical-qubit demonstrations with partners), so that when Microsoft’s own topological hardware matures, an end-to-end platform is already in place.
Milestones & Roadmap
Microsoft’s quantum roadmap can be described in three phases (as per their public materials): Foundational, Resilient, and Scale.
Foundational is about creating the first topological qubit and validating the physics. This took far longer than initially hoped: Microsoft spent well over a decade investigating materials such as indium arsenide nanowires coupled to superconductors to host Majorana zero modes, and only in 2018 did it identify and correct a fabrication issue that had led to false signals. Progress accelerated in 2021-2022, and in 2022 Microsoft researchers published evidence of having created and observed Majorana zero modes with the right properties (telltale signatures in conductance experiments). In 2023, they achieved a breakthrough by demonstrating a pair of Majoranas that could be manipulated – essentially the embryonic form of a topological qubit (published in Nature). This finally put Microsoft on the map for having a real qubit device (albeit not yet one performing computations).
Entering 2024-2025, Microsoft announced it had built a prototype chip, Majorana 1, integrating the elements needed for a topological qubit, and claimed the design is “ready to scale” to hundreds, then thousands, then a million qubits. The Majorana 1 processor, introduced in February 2025, is likely a test chip that can hold a few topological qubits and perform braiding operations (the fundamental topological gates). The announcement coincided with Microsoft being one of two companies advanced to the final phase of DARPA’s US2QC program, validating its approach to a utility-scale machine by 2033.
The Resilient phase of their roadmap involves building a small-scale quantum computer using topological qubits with error correction. Because each topological qubit is expected to be far more stable, the number needed to create a logical qubit is lower. Microsoft has hinted that once a single topological qubit is reliable, scaling to a few dozen could yield a logical qubit with minimal overhead (maybe 3-5 physical per logical in certain designs). They plan to demonstrate a fault-tolerant logical qubit “in years, not decades” – implying perhaps by ~2027 we could see a rudimentary topologically-protected logical qubit operating.
From there, the Scale phase aims for millions of qubits, using semiconductor manufacturing techniques to create arrays of nanowires or similar structures. Where a conventional machine of that capability might require a football field’s worth of integrated hardware, Microsoft argues that because topological qubits need less error-correction overhead, the effective logical qubit count from the same physical footprint would be very high.
In parallel, Microsoft’s Azure Quantum platform serves as a bridge: it already provides quantum computing access (to hardware from partners like IonQ, Quantinuum, and Rigetti) and a robust software stack (the Q# quantum programming language, tools for error mitigation, and more). By the time Microsoft’s own hardware is ready, users will have a mature environment. Microsoft has also introduced Azure Quantum Elements for chemistry and materials simulation using quantum resources, and Azure Quantum for HPC, which blends classical and quantum computing workflows. So a short-term milestone is making quantum computing useful through cloud services before Microsoft has its own quantum computer online.
Focus on Fault Tolerance
Microsoft has been fixated on fault tolerance from day one, which is why they chose the topological route (which is all about inherent fault tolerance). They often say a useful quantum computer must be “engineered for fault tolerance,” and they’re not as interested in NISQ algorithms. Their roadmap’s payoff is a machine that more or less directly achieves an error rate low enough to run arbitrarily long circuits. The fact that they categorize phases as “Resilient” and “Scale” shows their aim is to jump quickly to error-corrected operations. Part of DARPA’s evaluation was that Microsoft plans to “build a fault-tolerant prototype in years, not decades”, indicating a concrete prototype could be expected maybe ~5 years out (the exact timeline is undisclosed, but likely before 2030).
It’s worth noting that Microsoft also quietly works on short-term error correction with existing qubits: they collaborate with Quantinuum (e.g., that 12 logical qubit demo on Quantinuum’s hardware involved Microsoft researchers). They also invested in startups like Quantum Circuits Inc (Yale spin-off working on superconducting cat qubits) and PsiQuantum.
This multi-pronged strategy means if topological qubits had failed, Microsoft could pivot or join forces on another modality. Now that topological seems to be bearing fruit, they’re doubling down on it.
CRQC Implications
If Microsoft’s topological approach succeeds, it could be a game-changer for CRQC timelines. Topological qubits promise orders-of-magnitude lower error rates. In theory, a quantum computer made of topological qubits might need significantly fewer physical qubits to factor RSA-2048 than one made of transmons or trapped ions.
Microsoft’s claim of scaling to a million qubits is interesting: they have used that number in context of saying that’s the scale needed for full fault tolerance. If those are topological qubits, a million topological qubits could potentially yield many thousands of logical qubits (because one logical might be like 10-50 topological qubits, compared to 1000+ conventional qubits). With, say, 1,000 logical qubits, RSA-2048 falls, as do many other crypto schemes. So Microsoft’s mature machine (beyond 2030, perhaps 2035) would certainly be a CRQC.
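The overhead arithmetic above can be sketched numerically. The specific figures (10-50 topological vs ~1,000 conventional physical qubits per logical qubit, and ~1,000 logical qubits to threaten RSA-2048) are the rough estimates quoted in this section, not vendor-confirmed numbers:

```python
# Back-of-envelope comparison of logical-qubit yield at the million-physical-
# qubit scale, using the overhead ranges quoted above. Illustrative only.

PHYSICAL_QUBITS = 1_000_000   # Microsoft's stated "Scale" target
RSA_2048_LOGICAL = 1_000      # rough logical-qubit count cited above

def logical_yield(physical: int, overhead: int) -> int:
    """Logical qubits obtainable at a given physical-per-logical overhead."""
    return physical // overhead

scenarios = [
    ("topological (optimistic)", 10),
    ("topological (pessimistic)", 50),
    ("conventional (surface-code-like)", 1000),
]
for label, overhead in scenarios:
    logical = logical_yield(PHYSICAL_QUBITS, overhead)
    verdict = "exceeds" if logical >= RSA_2048_LOGICAL else "only reaches"
    print(f"{label:33s}: {logical:7d} logical qubits "
          f"({verdict} the ~{RSA_2048_LOGICAL}-logical-qubit RSA-2048 mark)")
```

Even the pessimistic topological overhead yields tens of thousands of logical qubits from a million physical ones, which is why the claim matters for CRQC timelines.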
But even before that, once they have, say, 100 topological qubits working, they might attempt smaller crypto-breaking demonstrations – for instance, showing that 50 topological qubits can break a toy-sized RSA key (128 or 256 bits) with far fewer resources than anyone else, as a way to prove the power of their tech.
However, given Microsoft’s focus, they might target applications like quantum chemistry first (as Bill Gates alluded to practical uses in 5 years, perhaps meaning materials or drug discovery). Still, the security world is watching: NSA representatives have often listed Microsoft’s topological qubit program as one of the serious but unpredictable threats on the horizon – if it suddenly works better than expected, it could compress the CRQC timeline drastically.
Modality & Strengths/Trade-offs
The topological qubit modality Microsoft pursues involves creating a pair of Majorana zero modes in a nanostructure. These modes form a qubit whose information is stored non-locally (half in one Majorana, half in the other). The chief strength is that certain noise processes can’t easily corrupt both halves at once – the qubit is inherently protected from many types of local disturbances. Error rates could be as low as 10⁻⁶ or even 10⁻⁹ per operation if the topological nature perfectly shields the qubit (compare ~10⁻³ for today’s best superconducting gates). Such low error rates would dramatically reduce the error-correction overhead; only light error detection and a handful of redundant qubits might be needed to handle the rare errors.
Another strength is that braiding operations (exchanging two Majoranas) implement quantum gates that are, in theory, exact and independent of the details of the path – this gives consistency and potentially simpler control.
Also, topological qubits are still based on superconducting devices (they use superconducting nanowires and magnetic fields), so they can leverage the existing superconducting qubit infrastructure to some extent (same fridges, similar wiring, etc., though with differences).
Microsoft’s approach also integrates with CMOS: they talk about a “compact superconducting topological qubit architecture”, suggesting they are designing qubit arrays that can be made using semiconductor fab methods, which could scale nicely.
Trade-offs: First, the approach only recently yielded a single qubit after a long struggle – it’s extremely complex to engineer. They rely on exotic materials (e.g., a specific semiconductor-superconductor heterostructure, precise epitaxial growth, and operating in high magnetic fields near absolute zero). It’s a low-temperature, delicate system akin to transmons in that sense, but with added complexity of nanowire networks.
Also, while braiding handles some gates, it doesn’t give a full universal set – non-Clifford gates (such as the T-gate, needed for universality) may still require “magic state” distillation or other supplemental methods. Topological qubits are not a complete free lunch; some level of error correction on top might still be required for certain operations.
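The magic-state cost can be made concrete with the standard 15-to-1 distillation protocol (Bravyi-Kitaev), where each round maps an input error rate p to roughly 35p³ while consuming 15 raw states per output. The target error rate below is an illustrative assumption:

```python
# Why low base error rates shrink the magic-state bill: each 15-to-1
# distillation round maps error rate p -> ~35 * p**3, at a cost of 15 raw
# states per distilled output. Target rate is an illustrative assumption.

def distill_rounds(p_in: float, target: float) -> tuple[int, float]:
    """Rounds of 15-to-1 distillation needed to push error below target."""
    rounds, p = 0, p_in
    while p > target:
        p = 35 * p ** 3
        rounds += 1
    return rounds, p

for p_in in (1e-3, 1e-6):
    rounds, p_out = distill_rounds(p_in, 1e-15)
    print(f"p_in = {p_in:.0e}: {rounds} round(s), "
          f"~{15 ** rounds} raw states per output, p_out ~ {p_out:.1e}")
```

A 10⁻³ starting error needs two rounds (~225 raw states per T-state), while a 10⁻⁶ start needs only one (~15) – so even though topological qubits still pay the distillation tax, they would pay far less of it.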
Another challenge: scaling up means fabricating arrays of nanowires with identical characteristics. The semiconductor industry can do a lot, but quantum coherence requirements are strict; variability could be an issue.
Also, controlling these qubits might require many tuned electrostatic gates, microwave pulses, and magnetic fields – the control overhead might be significant even if error rates are low.
Essentially, the trade-off is you solve a lot of the software complexity by building a much more complex piece of hardware. Microsoft’s bet is that investing in the hardware complexity now (making the qubits more stable) is better than having to do extreme error correction in software later. If they’re right, it pays off big; if they’re wrong, they might end up late to the game compared to those who brute-forced with regular qubits and error correction.
Track Record
Microsoft’s quantum program has been long-running (since the early 2000s), but for years it was more theoretical. Around the mid-2010s they had setbacks, including the retraction (in 2021) of a high-profile 2018 paper claiming a Majorana observation. This dented their reputation. However, they didn’t give up: by re-engineering the materials and process, they got results by 2022-2023 that convinced many skeptics. Publishing in Nature and being taken seriously by DARPA confirms their track record is now on an upswing.
In terms of delivering tools, Microsoft has excelled: its Q# language and developer kit are widely used by learners, and Azure Quantum integrates seamlessly with existing Azure developer workflows. Microsoft may not have built a quantum computer of its own (yet), but it has certainly contributed to the ecosystem (for example, it maintains one of the best quantum error correction simulators and has proposed new error-correcting codes).
Financially, Microsoft’s deep pockets mean it can sustain long-term R&D – which it has, even without near-term ROI. CEO Satya Nadella and former CEO Bill Gates both publicly endorse the effort, linking it to Microsoft’s overall cloud and computing strategy. Track-record-wise, then, aside from the delay in qubit creation, Microsoft has been consistent in vision and support. Now that it has a qubit, albeit a nascent one, the items to watch are: does it demonstrate an actual logical operation with two topological qubits? Does it scale to, say, 10 or 20 topological qubits in a device by 2026? Those will be the signs that the roadmap is on schedule.
Challenges
The primary challenge is execution risk: turning those first Majorana qubits into a full processor. They must now do what others have done with transmons/ions – make multiple qubits talk to each other – but in a topologically protected way. Designing a reliable “braiding” mechanism (like a network of nanowires with splitters to exchange Majoranas) is complex.
There’s also a challenge in integrating classical control at scale; Microsoft might need on-chip cryo-electronics to control gates, since wiring up millions of qubits with room-temperature electronics could be impossible (IBM faces the same issue at that scale). They have published on a concept of a “topological qubit network fabric”, so presumably they have a plan for modular connectivity and control.
Another challenge is that Microsoft has to catch up in practical experience: companies like IBM and Google have years of practice running real quantum computations, dealing with crosstalk, calibration, user requests, etc. Microsoft will be somewhat new to that when their machine comes online (though Azure Quantum gives them insight into how users are trying to use quantum computers).
Additionally, they face competition on all sides now: if they don’t deliver a topological advantage quickly enough, the superconducting/ion trap folks might have solved error correction via brute force. Nadella famously said he sees quantum computing as a fundamental shift and doesn’t want MS to miss out, so the pressure is on.
And an external consideration: while Microsoft focuses on topological, they must not ignore near-term progress; their involvement in DARPA and other collabs suggests they aren’t ignoring it, but there’s a strategic decision of how much to invest in interim approaches vs. all-in on topological. They appear to be bridging with Azure partnerships rather than, say, building a superconducting chip themselves (they did hire some experts and have small efforts with spin qubits and others, but topological remains the main bet).
In short, Microsoft’s roadmap is a moonshot that seems finally to be leaving the launchpad. The next steps will determine if it reaches orbit or not, but if it does, it could change the game.