Why Quantum Won’t Save Wall Street (Yet): An Honest Assessment of Quantum Computing in Finance
This article is part of my Quantum Utility Map Deep Dive series. It expands on the finance section of Quantum Computing by 2033 with a detailed examination of the evidence, the resource estimates, and what financial institutions should actually be doing with their quantum budgets.
The pitch and the problem
Every major bank has a quantum computing team. JPMorgan Chase, Goldman Sachs, Barclays, HSBC, Crédit Agricole, and dozens of others have invested hundreds of millions of dollars collectively in quantum research over the past decade. The pitch is compelling: Monte Carlo pricing of complex derivatives, portfolio optimization across thousands of assets, real-time risk computation that currently requires hours on classical clusters. If quantum computers can provide even a modest speedup on these calculations, the financial return would be enormous. The global OTC derivatives market alone is valued at approximately $846 trillion (BIS, mid-2025).
The problem is that the speedup quantum computers offer for these tasks is quadratic, and quadratic is not enough.
This article explains why, what the actual resource estimates say, and what financial institutions should be doing instead. I am writing it because the gap between quantum finance marketing and quantum finance evidence has grown wide enough to be doing real harm: misallocating research budgets, distracting from urgent cryptographic migration, and creating false expectations in boardrooms where the real quantum risks and opportunities are poorly understood.
The quadratic speedup, explained without the jargon
Classical Monte Carlo pricing of a complex derivative works by simulating thousands or millions of random market scenarios, computing the payoff in each scenario, and averaging the results. The accuracy improves with the number of scenarios, but slowly: to halve the error, you need four times as many scenarios. This is the square-root scaling of classical Monte Carlo, and it is the fundamental bottleneck.
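To make that scaling concrete, here is a minimal sketch (illustrative numbers, not a production pricer) of the standard-error formula behind classical Monte Carlo: the error falls as one over the square root of the scenario count, so halving it costs four times the work.

```python
import math

def mc_standard_error(sigma: float, n_samples: int) -> float:
    """Standard error of a Monte Carlo mean estimate: sigma / sqrt(N)."""
    return sigma / math.sqrt(n_samples)

sigma = 1.0  # standard deviation of the simulated payoff (illustrative)
for n in (10_000, 40_000, 160_000):
    print(f"{n:>7} scenarios -> standard error {mc_standard_error(sigma, n):.5f}")

# Quadrupling the scenario count halves the error:
assert math.isclose(mc_standard_error(sigma, 40_000),
                    mc_standard_error(sigma, 10_000) / 2)
```

The constant `sigma` drops out of the comparison; only the square-root dependence on the sample count matters here.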
Quantum amplitude estimation, the core quantum algorithm for financial Monte Carlo, reverses this relationship. To halve the error, you need only twice as many quantum operations. This is a square root improvement over the classical approach, which sounds transformative. In complexity theory terms, quantum provides an O(1/ε) convergence rate versus classical O(1/ε²), where ε is the target precision.
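Ignoring constant factors, the two convergence rates translate into sample and query counts for a target precision ε as follows (a schematic comparison, not a circuit-level cost model):

```python
def classical_samples(eps: float) -> int:
    """Classical Monte Carlo: error ~ 1/sqrt(N), so N ~ 1/eps^2."""
    return round(1 / eps**2)

def quantum_queries(eps: float) -> int:
    """Quantum amplitude estimation: error ~ 1/Q, so Q ~ 1/eps."""
    return round(1 / eps)

for eps in (1e-2, 1e-3, 1e-4):
    print(f"eps={eps:g}: classical ~{classical_samples(eps):,} samples, "
          f"quantum ~{quantum_queries(eps):,} queries")
```

At ε = 10⁻⁴ the raw query count looks dramatically better for the quantum side, but each quantum query is a full coherent payoff circuit under error correction, which is exactly where the rest of this article picks up.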
The catch: this improvement is polynomial (quadratic), not exponential. The quantum algorithm does the same fundamental thing as the classical algorithm (sample paths, compute payoffs, aggregate results), just with a more efficient convergence rate. The classical algorithm can be parallelized trivially across thousands of GPUs. The quantum algorithm must run as a single coherent computation on a single quantum processor, with all the overhead of quantum error correction layered on top.
This structural asymmetry (unlimited classical parallelism versus serial quantum coherence) is what makes the quadratic speedup insufficient in practice. I have written about the same structural problem in the context of Grover’s algorithm versus AES encryption, where the quadratic quantum speedup is similarly eroded by error correction overhead and classical parallelism. The finance case is structurally identical.
What the resource estimates actually say
Two papers define the current state of quantum finance resource estimation, and both were produced by teams at Goldman Sachs and IBM, the institutions with the strongest motivation to find quantum advantage in finance.
Chakrabarti, Krishnakumar, Mazzola, Stamatopoulos, Woerner, and Zeng (Goldman Sachs/IBM, 2021) provided the first complete resource estimate for useful quantum derivative pricing. Using autocallable and TARF (Target Accrual Redemption Forward) derivatives as benchmark use cases, they found that achieving quantum advantage requires approximately 7,500 logical qubits, a T-depth of 46 million, and a logical clock speed of approximately 10 MHz to complete the computation in roughly one second. The one-second target matters because derivative pricing in production environments must keep pace with market data feeds.
Stamatopoulos and Zeng (Goldman Sachs, 2024) applied quantum signal processing (QSP) to encode derivative payoffs directly into quantum amplitudes, eliminating costly quantum arithmetic. This reduced requirements to approximately 4,700 logical qubits and 10⁹ T-gates, with a required T-gate throughput of approximately 45 MHz: a 16× reduction in T-gate count and a drop from 7,500 to 4,700 logical qubits relative to the 2021 estimate.
These are impressive algorithmic improvements. They are also still far beyond any projected hardware capability. Current fault-tolerant quantum hardware targets logical clock rates in the low kilohertz range. The finance applications require 10–45 MHz. That is a gap of three to four orders of magnitude.
Put differently: even with the most optimized algorithms available, a quantum computer fast enough to price derivatives competitively with classical Monte Carlo would need to execute error-corrected logical gates roughly 10,000 times faster than any machine projected to exist before 2035.
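A back-of-envelope calculation makes the gap vivid. The T-depth below is the figure quoted from the 2021 Goldman Sachs/IBM estimate; the assumption that one T-layer completes per logical cycle is a simplification for illustration.

```python
T_DEPTH = 46_000_000  # logical T-depth from the 2021 Goldman Sachs/IBM estimate

def wall_clock_seconds(t_depth: int, logical_rate_hz: float) -> float:
    """Runtime if one T-layer executes per logical clock cycle (simplified)."""
    return t_depth / logical_rate_hz

print(f"at 10 kHz (near-term projection): {wall_clock_seconds(T_DEPTH, 10e3):,.0f} s (~77 minutes)")
print(f"at 10 MHz (finance requirement):  {wall_clock_seconds(T_DEPTH, 10e6):,.1f} s")
```

At projected kilohertz logical clock rates, a single pricing run takes over an hour; at the megahertz rates the finance papers assume, it lands within the same order of magnitude as the one-second market-data target. The three-orders-of-magnitude gap between those two rows is the point.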
The three barriers, and why they are structural
Barrier 1: The clock rate gap
Current projections for fault-tolerant quantum hardware place logical clock rates at approximately 10–100 kHz by 2030. Finance applications require 10–45 MHz. This gap cannot be closed by adding more qubits. It is a function of the physical speed of error correction cycles: how fast syndrome measurements can be performed, decoded, and fed back to the quantum processor. Improving clock rates by three orders of magnitude requires fundamental advances in decoder speed, measurement fidelity, and control electronics throughput. These advances may come, but they are engineering challenges distinct from the qubit-scaling progress that dominates quantum computing roadmaps.
Barrier 2: The parallelism asymmetry
A classical Monte Carlo simulation distributes trivially across GPU clusters. A bank running derivative pricing on a cluster of 10,000 NVIDIA H100 GPUs achieves massive parallelism at commodity cost. The quantum algorithm must run as a single coherent computation; you cannot split a quantum amplitude estimation circuit across multiple processors the way you split classical Monte Carlo across GPUs. This means the quantum computer must be faster per-unit-of-computation than the entire classical cluster, not faster than a single classical core. The quadratic speedup must overcome the parallelism of the classical alternative, and at current and projected parallelism levels, it does not.
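One way to quantify the asymmetry is to compute the break-even query rate at which a single serial quantum processor matches a GPU cluster on error versus wall-clock time. The cluster size and per-GPU throughput below are hypothetical placeholders, not benchmarks.

```python
import math

def breakeven_quantum_query_rate(n_gpus: int, samples_per_gpu_per_s: float,
                                 runtime_s: float) -> float:
    """
    Classical error after time t: ~ 1/sqrt(G * r * t) for G GPUs at r samples/s.
    Quantum error after Q = f * t queries: ~ 1/(f * t).
    Equating the two gives the break-even query rate f = sqrt(G * r / t).
    """
    return math.sqrt(n_gpus * samples_per_gpu_per_s / runtime_s)

# Hypothetical cluster: 10,000 GPUs, 1e6 payoff evaluations per GPU per second.
f = breakeven_quantum_query_rate(10_000, 1e6, runtime_s=1.0)
print(f"break-even: ~{f:,.0f} amplitude-estimation queries per second")
```

Even this flatters the quantum side: each amplitude-estimation query is itself a full payoff circuit thousands of logical gates deep, so the required logical gate rate is the break-even query rate multiplied by the circuit depth, which is how the megahertz-scale throughput requirements in the resource estimates arise.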
Barrier 3: The moving classical baseline
Classical finance computation is improving continuously. GPU-accelerated Monte Carlo, AI-driven surrogate models, better variance reduction techniques, and improved numerical methods all erode the quantum advantage over time. The quantum speedup is defined relative to a classical baseline that does not stand still. A quadratic improvement over today’s best classical approach may be a negligible improvement over the classical approach available in 2033, because classical methods will have improved by then as well.
This is a structural difference from quantum chemistry, where the classical wall is fundamental (exponential scaling of quantum system simulation) rather than economic (the cost of more GPUs). In chemistry, no amount of classical hardware can simulate a strongly correlated 50-electron system exactly. In finance, more classical hardware always helps, and the cost per unit of classical compute falls every year.
Portfolio optimization: same barrier, different disguise
Portfolio optimization (mean-variance optimization, CVaR minimization, index tracking) has been extensively promoted as a quantum computing application. The proposed quantum approaches use either QAOA (Quantum Approximate Optimization Algorithm) or quantum annealing to search for optimal portfolios across combinatorial spaces.
The quantum speedup is, again, at best quadratic. And the classical alternatives (mixed-integer programming solvers, heuristic algorithms, classical simulated annealing) are mature, parallelizable, and improving. The largest QAOA demonstrations to date have not matched the performance of classical heuristic solvers on equivalent problem instances.
More concerning: recent theoretical work has shown that for many combinatorial optimization problems, the quantum speedup offered by QAOA vanishes at problem sizes where the optimization is actually hard. The easy instances can be solved classically; the hard instances resist both classical and quantum approaches.
Risk analysis and VaR
Value-at-risk (VaR) computation and stress testing use Monte Carlo at massive scale, making them natural candidates for quantum speedup. The same barriers apply: quadratic improvement, clock rate requirements, parallelism asymmetry. The Bank for International Settlements’ Quantum Readiness Roadmap (July 2025) recommends that systemically important financial institutions support post-quantum cryptography by 2030, but makes no recommendation about quantum computational advantage for risk analysis, because the evidence does not yet support one.
What would change the picture
I want to be direct about this: I am not saying quantum computing will never help in finance. I am saying that the current evidence does not support near-term advantage, and strategic planning should reflect that. Three developments would change my assessment.
A super-polynomial speedup for a financial problem class. If someone discovers a quantum algorithm that provides an exponential (or even cubic) speedup for a specific class of financial computation, the entire analysis above becomes obsolete. This has not happened in 30 years of quantum algorithm research for optimization problems, but the field is active and surprises are possible. The most promising direction is quantum algorithms for specific structured problems (sparse or low-rank covariance matrices, certain stochastic differential equations) rather than generic optimization.
A breakthrough in logical clock rates. If fault-tolerant quantum hardware achieves 10+ MHz logical clock rates through advances in decoder design, measurement speed, or alternative error correction architectures, the clock rate barrier dissolves. Recent advances in algorithmic fault tolerance compress the number of error correction cycles per logical operation, which effectively increases logical throughput. If this trend continues aggressively, the 10 MHz target may become achievable sooner than current roadmaps suggest.
A classical plateau in financial computation. If GPU-based Monte Carlo and AI surrogate models hit fundamental accuracy or scalability limits for complex derivatives, the relative value of even a quadratic quantum speedup increases. This seems unlikely given current trends in classical hardware and software, but it cannot be ruled out.
Until at least one of these developments materializes, quantum advantage in finance remains speculative.
What financial institutions should actually be doing
Priority 1: Post-quantum cryptography migration
The highest-value quantum investment for any financial institution today is post-quantum cryptography (PQC) migration. The reason is straightforward: the cryptographic threat from quantum computing is certain, near-term, and directly relevant to financial services, while the computational advantage is uncertain, far-term, and structurally weak.
The harvest-now-decrypt-later (HNDL) threat is active today. Adversaries with access to encrypted financial data (transaction records, client communications, trading strategies, M&A negotiations) can store that data now and decrypt it later when a cryptographically relevant quantum computer (CRQC) becomes available. For financial data with long confidentiality requirements (decade-plus), this means the window for protection is closing now, regardless of when a CRQC actually arrives.
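The timing argument can be framed as Mosca's inequality: if the data's required confidentiality lifetime plus the migration time exceeds the time until a CRQC exists, data encrypted today is already exposed. The planning inputs below are hypothetical illustrations, not estimates.

```python
def hndl_exposed(shelf_life_years: float, migration_years: float,
                 years_to_crqc: float) -> bool:
    """Mosca's inequality: exposed when x + y > z."""
    return shelf_life_years + migration_years > years_to_crqc

# Hypothetical planning inputs for long-lived client records:
print(hndl_exposed(shelf_life_years=10, migration_years=7, years_to_crqc=12))
# True: the data would still need protection after a CRQC arrives.
```

Note that the conclusion holds even under optimistic CRQC timelines once the confidentiality requirement is a decade or more, which is exactly the situation for much financial data.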
NIST has finalized the post-quantum cryptographic standards (ML-KEM, formerly CRYSTALS-Kyber; ML-DSA, formerly CRYSTALS-Dilithium; SLH-DSA, formerly SPHINCS+), and regulatory deadlines are already being set: the BIS recommends PQC readiness by 2030 for systemically important institutions. You do not need a precise Q-Day prediction to act; the compliance deadlines are already on the calendar.
This is where your quantum budget should go first. Every dollar spent on PQC migration addresses a known, quantifiable risk with a clear regulatory driver. Every dollar spent on quantum finance advantage research addresses an uncertain, structurally challenged opportunity.
Priority 2: Monitor the field, don’t abandon it
Maintaining a small, expert quantum research team makes strategic sense even without near-term advantage. Algorithmic breakthroughs are unpredictable, and the institution that has quantum expertise when a breakthrough occurs will be positioned to exploit it faster than competitors who have to start from scratch. The key is right-sizing the investment: a monitoring and readiness posture, not a production deployment bet.
Track three signals specifically: any peer-reviewed result demonstrating super-polynomial quantum speedup for a financial problem class; logical clock rate demonstrations exceeding 1 MHz; and resource estimate reductions that bring derivative pricing below 1,000 logical qubits (this would suggest a qualitative shift in algorithmic approach, not just incremental optimization).
Priority 3: Quantum-adjacent opportunities
Some financial applications sit adjacent to quantum computing without requiring quantum advantage in finance itself. Quantum-resistant cryptographic infrastructure protects trading systems and client data. Quantum random number generation provides certified randomness for cryptographic key generation. And understanding quantum computing’s impact on other industries (pharma, materials, energy) informs investment analysis, venture capital strategy, and sector allocation decisions.
A bank that understands the Quantum Utility Ladder can make better investment decisions about quantum computing companies, better risk assessments of quantum-exposed sectors, and better strategic advice to clients navigating the transition. That analytical advantage may be worth more than any direct computational speedup.
The honest summary
Quantum computing in finance has been oversold. The quadratic speedup offered by quantum amplitude estimation and related algorithms is structurally insufficient to overcome the combined barriers of clock rate requirements, classical parallelism, and a continuously improving classical baseline. The best resource estimates, produced by Goldman Sachs’ own research team, require hardware capabilities three to four orders of magnitude beyond current projections.
This does not diminish quantum computing as a technology. In chemistry, materials science, and physics, the quantum advantage is genuine, growing, and grounded in exponential classical difficulty. Finance simply faces a different mathematical structure: one where classical methods scale well and quantum methods face overhead that erodes their theoretical edge.
The right quantum strategy for financial institutions is to invest heavily in PQC migration (the certain, near-term threat), maintain a lean quantum research capability (the uncertain, far-term opportunity), and develop the analytical expertise to understand quantum computing’s impact on the industries that will actually be transformed. The quantum revolution in finance will come through understanding quantum’s impact on the real economy, not through pricing derivatives 2× faster.
For the full technical resource estimates underlying this analysis, see The Quantum Utility Ladder. For the broader industry-by-industry competitive analysis, see Quantum Computing by 2033. For practical guidance on starting PQC migration, see Practical Steps to Quantum Readiness and the PQC Migration Framework.
Quantum Upside & Quantum Risk - Handled
My company - Applied Quantum - helps governments, enterprises, and investors prepare for both the upside and the risk of quantum technologies. We deliver concise board and investor briefings; demystify quantum computing, sensing, and communications; craft national and corporate strategies to capture advantage; and turn plans into delivery. We help you mitigate the quantum risk by executing crypto‑inventory, crypto‑agility implementation, PQC migration, and broader defenses against the quantum threat. We run vendor due diligence, proof‑of‑value pilots, standards and policy alignment, workforce training, and procurement support, then oversee implementation across your organization. Contact me if you want help.

