Industry News

A Reality Check on Forbes’ “20 Real-World Quantum Computing Applications”

When a mainstream piece on quantum computing lands, I usually file it under “interesting, but not news”. Yet over the past few days my inbox and Signal chats have been flooded with links to Forbes Tech Council’s new listicle, “20 Real‑World Applications of Quantum Computing to Watch.” Enough colleagues and clients have asked for a verdict that, rather than draft twenty individual replies, I’m parking my thoughts here for everyone at once.

Quantum is enjoying a peak‑hype moment, so every bold claim deserves a reality check. A few of Forbes’s examples are genuinely promising; several smash together disparate ideas without noting the engineering road‑map; and some miss key caveats that separate “demo‑ready” from “production‑ready.” The authors could have done a sharper job sorting mature use cases (think quantum key distribution) from long shots (say, universal quantum AI on petabyte‑scale data)—and of explaining the hardware, algorithmic, and economic hurdles in between.

That said, knee‑jerk contrarianism is just as unhelpful as uncritical hype. As I argued in my recent essay on Quantum Contrarianism, pointing out a few overstatements does not prove the entire field is snake oil. The signal sits somewhere between the boosters and the naysayers.

Below is a quick rundown of key use cases: where Forbes is on target, where it overreaches, and where the signal is still buried in marketing noise.

Secure Communication and Cryptography

Forbes rightly highlights communication security as a top application. This is valid and already happening – as I often write about, QKD is already enabling provably secure key exchange (e.g., the Micius satellite network) and quantum random number generators (QRNGs) are bolstering encryption keys. These are real deployments. In this domain, quantum tech is a clear boon for cybersecurity, not just a future hope. We can validate that quantum-secured communication links (whether fiber or satellite) are operational today, protecting diplomatic and financial data against both classical and quantum spying.
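To make the mechanics concrete, here is a minimal pure-Python sketch of the basis-sifting step at the heart of BB84, the protocol family behind most QKD deployments. It is illustrative only: no eavesdropper, channel noise, or error reconciliation is modeled, and the function name is my own.

```python
import random

def bb84_sift(n_bits: int, seed: int = 7) -> list[int]:
    """Toy BB84 sifting: Alice sends random bits in random bases (Z or X);
    Bob measures each photon in a randomly chosen basis; afterwards they
    publicly compare bases and keep only the matching-basis bits."""
    rng = random.Random(seed)
    alice_bits = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.choice("ZX") for _ in range(n_bits)]
    bob_bases = [rng.choice("ZX") for _ in range(n_bits)]
    # When bases match, Bob's measurement reproduces Alice's bit exactly;
    # mismatched-basis results are random and are discarded (sifted out).
    return [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases)
            if a == b]

key = bb84_sift(1024)
print(len(key))  # roughly half the bases match, so ~512 sifted bits
```

In a real deployment the parties would additionally sacrifice a sample of the sifted key to estimate the error rate (an eavesdropper necessarily introduces errors), then run error correction and privacy amplification.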

Simulations for Autonomous Vehicle (AV) Testing

The Forbes article mentions “simulations for autonomous vehicle testing” – suggesting quantum computers could simulate countless driving scenarios faster. This is related to optimization and Monte Carlo simulation. There is a known quantum algorithm (quantum amplitude estimation) that can quadratically speed up Monte Carlo simulations, which could in theory help sample rare events in AV testing faster. However, this requires a gate-based quantum computer with enough qubits and low error rates to implement amplitude estimation, which is well beyond current NISQ devices. Moreover, simulating an autonomous driving environment involves huge amounts of sensor data (a big data problem). Quantum computers excel at “big compute on small data,” not big data problems. The bandwidth of getting data in/out of a quantum processor is limited, so feeding it millions of driving scenarios (each with high-dimensional state) is a challenge in itself.
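The scaling gap behind that claim is easy to show in back-of-the-envelope form. Classical Monte Carlo needs on the order of 1/ε² samples to estimate a probability to additive error ε, while quantum amplitude estimation needs on the order of 1/ε oracle calls. The sketch below (my own function names; all constants and error-correction overheads deliberately dropped) compares the two:

```python
import math

def classical_mc_samples(eps: float) -> int:
    """Samples for additive error eps: Chernoff-style ~1/eps^2 scaling."""
    return math.ceil(1.0 / eps ** 2)

def amplitude_estimation_queries(eps: float) -> int:
    """Oracle queries for quantum amplitude estimation: ~1/eps scaling.
    Assumes a fault-tolerant gate-based machine; constants omitted."""
    return math.ceil(1.0 / eps)

# The quadratic gap widens as the target precision tightens.
for eps in (0.25, 0.01, 0.001):
    print(eps, classical_mc_samples(eps), amplitude_estimation_queries(eps))
```

The catch, as the paragraph above notes, is that each "oracle call" must encode a full driving scenario into quantum state, which is exactly where the I/O bottleneck bites.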

In short, while the idea is intriguing – perhaps one day using a quantum computer to evaluate many crash scenarios or edge cases for an AI driver – it is decades premature to list this as a real-world application. It’s more realistic that classical supercomputers and specialized GPUs will handle AV simulations, whereas quantum might contribute to narrower sub-tasks (e.g. optimizing a machine learning model within the AV system, or quickly solving a path-planning problem via quantum search).

Rapid Data Analysis/Big Data

Forbes and other outlets often claim quantum computing will accelerate Big Data analytics for business insights, etc. This is misleading. As mentioned, quantum machines are not well suited to arbitrary large-scale data crunching because input/output constraints mean they cannot ingest massive data any faster than classical in most cases.

Quantum algorithms that do promise speedups (like Grover’s algorithm for search or HHL for linear algebra) generally assume the data is already in quantum memory (qRAM), which is a huge assumption. In practice, loading terabytes of data into qubits would wipe out theoretical speed gains.
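The arithmetic that kills most "quantum big data" pitches is worth spelling out. Grover search makes ~(π/4)·√N oracle queries, but if the N data items start out classical, simply loading them takes Ω(N) steps, so the end-to-end cost is dominated by the load. A rough cost model (my own function names, constants indicative only):

```python
import math

def grover_queries(n_items: int) -> int:
    """Oracle queries for Grover search over an unstructured list: ~(pi/4)*sqrt(N)."""
    return math.ceil(math.pi / 4 * math.sqrt(n_items))

def end_to_end_cost(n_items: int) -> int:
    """If the data is classical, it must first be loaded into the quantum
    device: Omega(N) steps. That load dominates the total, erasing the
    quadratic query advantage unless a qRAM already holds the data."""
    return n_items + grover_queries(n_items)

n = 10 ** 9
print(grover_queries(n))   # tens of thousands of queries -- great in isolation
print(end_to_end_cost(n))  # but loading costs ~10^9 steps regardless
```

This is why the "big compute on small data" framing recurs: the advantage survives only when the problem description is compact and the hard part is computation, not ingestion.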

Leading quantum researchers have emphasized that quantum computers will likely be used for specific compute-intensive subroutines on relatively small data segments – for example, accelerating a linear algebra step on a matrix of moderate size – rather than replacing your entire Hadoop/Spark cluster for big data.

So, claims of “rapid data analysis” at scale should be taken with a grain of salt. They are overstated unless referring to very particular tasks. A more concrete near-term use might be quantum-assisted data sampling: e.g., using quantum routines to sample from a large dataset according to some distribution more efficiently (quantum sampling can in some cases offer speedups). But again, that’s a niche capability, not a general data processing revolution.

Drug Discovery and Materials Science

This is an application area frequently mentioned (including by Forbes) that is actually on solid footing theoretically, though still in progress. Simulating quantum systems (molecules, materials) is what Richard Feynman originally imagined quantum computers would do, and it remains one of the most promising use cases. Chemistry and materials science simulations exploit the fact that nature is quantum; a quantum computer can, in principle, model chemical interactions exponentially faster than classical simulation. This could lead to breakthroughs in drug discovery (e.g. modeling protein folding or receptor-ligand interactions) and the design of new materials (better batteries, superconductors, etc.).

Unlike some hype use cases, this one has firm evidence: small-scale quantum computers have already simulated simple molecules (like the energy of hydrogen chains, lithium hydride, etc.) accurately. As hardware scales, these simulations will extend to industrially relevant chemicals. The caveat is that to outperform classical chemistry methods, we likely need error-corrected qubits or extremely high-fidelity analog quantum simulators – we’re not there yet. Nonetheless, the consensus in the scientific community is that quantum simulation for chemistry/materials is valid and likely the first area (besides cryptography) where quantum computers will significantly impact the real world. It’s fair to include this in “real-world applications to watch,” but with the understanding that the timeline might be quite a few years for noticeable practical results (e.g. discovering a new drug lead or catalyst with quantum computing assistance).
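The small-molecule demonstrations mentioned above mostly use variational methods: a classical optimizer tunes a parameterized quantum state to minimize the measured energy. A deliberately tiny pure-Python sketch of that loop, using a made-up 2×2 "Hamiltonian" and a single-qubit ansatz in place of a real molecule and a real device:

```python
import math

# Toy 2x2 Hamiltonian (made-up numbers, real symmetric):
H = [[1.0, 0.5],
     [0.5, -1.0]]

def energy(theta: float) -> float:
    """Expectation <psi(theta)|H|psi(theta)> for the one-parameter ansatz
    |psi> = cos(theta/2)|0> + sin(theta/2)|1>."""
    c, s = math.cos(theta / 2), math.sin(theta / 2)
    return c * c * H[0][0] + 2 * c * s * H[0][1] + s * s * H[1][1]

# Classical outer loop: scan the parameter, keep the lowest energy found.
# (Real experiments use gradient or gradient-free optimizers, not a grid.)
best = min(energy(k * 2 * math.pi / 2000) for k in range(2000))

# For a 2x2 symmetric matrix the ground energy is known in closed form.
exact = (H[0][0] + H[1][1]) / 2 - math.sqrt(
    H[0][1] ** 2 + ((H[0][0] - H[1][1]) / 2) ** 2)
print(best, exact)  # both close to -1.118
```

The structure scales up directly: more qubits, a richer ansatz, and energies estimated from repeated measurements on hardware rather than computed exactly, which is where noise and fidelity limits enter.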

Quantum for Optimization (e.g. logistics, routing)

Forbes panelists cited things like optimizing supply chains or traffic flows as an area for quantum computing. This is one of the most overstated claims. The reality: while quantum algorithms (like Grover’s search or quantum annealing heuristics) can help certain optimization problems, they do not provide the dramatic speedups people assume for generic operations research problems. The best known quantum speedups for broad optimization are polynomial (e.g. quadratic via Grover, or maybe constant-factor improvements via quantum sampling) – not exponential.

Research led by Microsoft’s quantum team found that for many optimization tasks, a quantum algorithm might only outperform classical after reaching unrealistically large problem sizes and only if quantum hardware is fault-tolerant and extremely fast. In fact, Matthias Troyer noted that in the last decade, “many things that people have proposed [quantum could do] don’t work” in practice, often for simple reasons like I/O bottlenecks or error rates.

Case in point: D-Wave’s quantum annealers (special-purpose quantum machines for optimization) have been tested on traffic flow optimization and scheduling problems, but so far, classical algorithms typically match or beat them when runtime and solution quality are considered. A Nature paper in 2022 showed no quantum advantage in a D-Wave scheduling benchmark once classical heuristics were optimized. Thus, while research continues (especially on quantum approximate optimization algorithms, QAOA), the claim that quantum computing will revolutionize day-to-day logistics in the near term is unfounded. It remains a long-term possibility if new algorithms emerge.
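"Quadratic, not exponential" deserves emphasis, because a quadratic speedup over an exponential-time brute force is still exponential time. A two-line cost model (my own function names) makes the point for search over all assignments of n binary decision variables:

```python
import math

def classical_search_ops(n_vars: int) -> int:
    """Brute force over all 2^n assignments."""
    return 2 ** n_vars

def grover_search_ops(n_vars: int) -> int:
    """Grover over the same space: ~sqrt(2^n) = 2^(n/2) oracle calls.
    A quadratic speedup -- but still exponential in problem size."""
    return math.ceil(2 ** (n_vars / 2))

for n in (20, 40, 60):
    print(n, classical_search_ops(n), grover_search_ops(n))
```

At n = 60, Grover still needs ~2^30 oracle calls, each slower than a classical instruction by large constant factors on foreseeable hardware, which is why the Microsoft-style analyses land where they do.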

AI and Machine Learning

Many popular articles (Forbes included) list AI/ML as a domain to be turbo-charged by quantum computing. This is partially true but often oversold for the near term. Personally, I believe the impact of quantum on AI and ML will be massive, but over the medium to long term. We do have quantum algorithms (like quantum support vector machines, quantum principal component analysis, variational quantum classifiers, etc.) and some demonstrated small examples where a quantum circuit can perform a kernel classification or linear algebra routine effectively.

However, as Scott Aaronson and others have pointed out, “read the fine print” on quantum machine learning algorithms. Often, they assume access to a “quantum RAM” for loading data or they only shave off polynomial factors in complexity that might not outweigh constant-factor slowdowns. For instance, the Harrow-Hassidim-Lloyd (HHL) algorithm can in theory solve certain linear systems exponentially faster, forming a basis for some quantum ML – but that exponential speedup is contingent on ideal conditions (well-conditioned matrices, efficiently loadable state, etc.), and in practice it may devolve to a more modest gain.
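To see how the fine print can flip the verdict, compare the commonly quoted asymptotic scalings: roughly s²κ²·log(N)/ε for HHL versus roughly N·s·κ·log(1/ε) for classical conjugate gradient on a sparse, well-conditioned system (s = sparsity, κ = condition number). The toy cost model below is mine, with all constants dropped, so treat it as a scaling illustration rather than a benchmark:

```python
import math

def hhl_cost(N: int, kappa: float, s: float, eps: float) -> float:
    """Rough HHL scaling: ~ s^2 * kappa^2 * log(N) / eps (constants dropped)."""
    return s ** 2 * kappa ** 2 * math.log2(N) / eps

def cg_cost(N: int, kappa: float, s: float, eps: float) -> float:
    """Rough classical conjugate-gradient scaling: ~ N * s * kappa * log(1/eps)."""
    return N * s * kappa * math.log2(1 / eps)

# Friendly case: huge, sparse, well-conditioned system -- HHL's log(N) wins.
print(hhl_cost(2 ** 30, kappa=10, s=4, eps=0.01)
      < cg_cost(2 ** 30, kappa=10, s=4, eps=0.01))    # True
# Unfriendly case: condition number blows up -- the advantage evaporates.
print(hhl_cost(2 ** 30, kappa=10 ** 7, s=4, eps=0.01)
      < cg_cost(2 ** 30, kappa=10 ** 7, s=4, eps=0.01))  # False
```

And this comparison still assumes the state-preparation and readout problems away, which is precisely Aaronson's point.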

So, while there is active research and even some near-term proposals (like using quantum generative models for synthetic data or quantum Boltzmann machines), no one has achieved a clear quantum advantage in ML on real-world data yet. The potential is there, especially for high-dimensional pattern recognition or combinatorial feature selection, but it’s not the case that quantum computers will replace GPUs for AI training in the next few years. A more plausible scenario is hybrid quantum-classical ML: a classical system handles big data preprocessing and most layers of a model, but calls a quantum subroutine for a particularly hard task (say, sampling from a probabilistic model or computing a kernel on exponentially large feature space) which provides a boost. This hybrid vision aligns with expert views that a quantum computer will likely work as an accelerator alongside classical systems, not an outright replacement.

In summary, many proposed quantum applications require careful reality checks. It’s useful to categorize them:

  • Near-term feasible and useful: Quantum cryptography (QKD/QRNG) – already deployed; quantum-safe encryption (PQC); possibly certain niche optimization (where the problem size is tiny and coherence times suffice) – limited cases.
  • Medium-term potential with hardware scaling: Quantum chemistry/material science – likely to show advantage as qubit counts and fidelity improve; Some structured optimization or Monte Carlo acceleration – once error-corrected qubits come, we might see quadratic speedups used in finance risk analysis or logistics, but classical competition is stiff.
  • Long-term or uncertain: General big-data analytics, generic machine learning speedups, solving NP-complete problems at scale – these are either unlikely (NP-complete is believed intractable even for quantum) or require major breakthroughs (both in algorithms and hardware). As one report put it, a wide range of oft-cited applications will “require significant algorithmic improvements” to become practical – we can’t just count on the quantum hardware alone to deliver magic.

Cybersecurity leaders should remain skeptical of grandiose claims (especially on short time horizons) and seek out peer-reviewed evidence or prototype demonstrations. Being optimistic about quantum’s future is fine – indeed, new algorithms might surprise us – but we must base investments and preparations on realistic assessments, not hype.

Marin Ivezic

I am the Founder of Applied Quantum (AppliedQuantum.com), a research-driven professional services firm dedicated to helping organizations unlock the transformative power of quantum technologies. Alongside leading its specialized service, Secure Quantum (SecureQuantum.com)—focused on quantum resilience and post-quantum cryptography—I also invest in cutting-edge quantum ventures through Quantum.Partners. Currently, I’m completing a PhD in Quantum Computing and authoring an upcoming book “Practical Quantum Resistance” (QuantumResistance.com) while regularly sharing news and insights on quantum computing and quantum security at PostQuantum.com. I’m primarily a cybersecurity and tech risk expert with more than three decades of experience, particularly in critical infrastructure cyber protection. That focus drew me into quantum computing in the early 2000s, and I’ve been captivated by its opportunities and risks ever since. So my experience in quantum tech stretches back decades, having previously founded Boston Photonics and PQ Defense where I engaged in quantum-related R&D well before the field’s mainstream emergence. Today, with quantum computing finally on the horizon, I’ve returned to a 100% focus on quantum technology and its associated risks—drawing on my quantum and AI background, decades of cybersecurity expertise, and experience overseeing major technology transformations—all to help organizations and nations safeguard themselves against quantum threats and capitalize on quantum-driven opportunities.