Quantum Security & PQC News

BIS Project Leap Phase 2 – PQC in Real-World Payment Systems

13 Dec 2025 – The Bank for International Settlements (BIS) and the Eurosystem published the results of Project Leap Phase 2, a large-scale technical trial of post-quantum cryptography (PQC) on the TARGET2 payment system (the Real-Time Gross Settlement system for the euro). The project involved the Banque de France, the Deutsche Bundesbank, the Bank of Italy, and Swift.

The headline finding was positive: the consortium proved it could functionally send PQC-signed liquidity transfers between central banks. However, the technical details buried in the report matter. The report explicitly states that “Post-quantum signature verification took meaningfully longer than traditional RSA-based verification” and that the system suffered from packet-size issues that required substantial redevelopment.

Technical Deep Dive: The Physics of Latency

Project Leap tested the “Hybrid” approach – running PQC and RSA side-by-side to ensure backward compatibility and defense-in-depth. They found that legacy payment architectures (specifically the ESMIG connector used in TARGET2) were “ill-prepared” for the realities of hybrid cryptography.   

The Byte-Size Problem: The fundamental issue is the size of the keys and signatures.

  • Legacy RSA: An RSA-2048 public key is 256 bytes. Its signature is 256 bytes.
  • PQC (ML-DSA): An ML-DSA-44 public key is 1,312 bytes. Its signature is 2,420 bytes.   
  • The Multiplier: This represents a roughly 10x increase in signature size. When a payment system processes thousands of transactions per second (TPS), this data bloat accumulates. In Project Leap, this “packet bloat” caused issues with message handling logic, as headers exceeded expected buffer sizes.   
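The bullet points above can be turned into a quick back-of-envelope calculation. The signature sizes are the published figures (RSA-2048 per PKCS#1, ML-DSA-44 per FIPS 204); the 5,000 TPS throughput is an illustrative assumption of mine, not a number from the BIS report:

```python
# Rough sketch of the "packet bloat" effect described above.
# RSA_SIG_BYTES and ML_DSA_44_SIG_BYTES are published sizes;
# TPS is an assumed throughput for illustration only.

RSA_SIG_BYTES = 256        # RSA-2048 signature
ML_DSA_44_SIG_BYTES = 2_420  # ML-DSA-44 signature (FIPS 204)

multiplier = ML_DSA_44_SIG_BYTES / RSA_SIG_BYTES  # ~9.5x

TPS = 5_000  # assumed transactions per second (not from the report)
extra_bytes_per_sec = TPS * (ML_DSA_44_SIG_BYTES - RSA_SIG_BYTES)
extra_gb_per_day = extra_bytes_per_sec * 86_400 / 1e9

print(f"signature size multiplier: {multiplier:.1f}x")
print(f"extra signature data per day at {TPS} TPS: {extra_gb_per_day:.0f} GB")
```

At that assumed load, signatures alone add nearly a terabyte of traffic per day — the kind of volume that overflows message buffers sized for 256-byte fields.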

The Latency Gap: Beyond size, the computation required to verify these signatures is heavier. The report highlighted that verification latency increased significantly. In a high-frequency environment like TARGET2, where liquidity is optimized to the microsecond, cumulative latency translates to delayed settlement windows. The study noted that “Post-quantum signature verification took meaningfully longer,” forcing a reassessment of capacity planning.   
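How per-signature latency compounds at scale can be sketched with a simple model. The per-verification timings below are hypothetical placeholders — the BIS report says only that PQC verification “took meaningfully longer” and does not publish per-operation numbers — as is the batch size:

```python
# Illustrative model of cumulative verification latency.
# Both per-signature costs and the batch size are assumptions,
# chosen only to show how small deltas compound at volume.

RSA_VERIFY_MS = 0.03      # assumed RSA-2048 verify cost per message
ML_DSA_VERIFY_MS = 0.10   # assumed ML-DSA-44 verify cost per message

batch = 10_000  # assumed messages in one settlement window

rsa_total_ms = batch * RSA_VERIFY_MS
# A hybrid wrapper verifies BOTH signatures on every message:
hybrid_total_ms = batch * (RSA_VERIFY_MS + ML_DSA_VERIFY_MS)

print(f"RSA-only: {rsa_total_ms / 1000:.1f} s per batch")
print(f"Hybrid:   {hybrid_total_ms / 1000:.1f} s per batch")
```

Even with these modest assumed costs, the hybrid path more than quadruples total verification time per batch — exactly the kind of cumulative drag that forces capacity replanning in an RTGS environment.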

Data Integration (from BIS Report): Analysis of the technical annexes reveals the scale of the discrepancy. While RSA verification is near-instantaneous on modern hardware, ML-DSA verification, especially in a hybrid wrapper, introduced measurable delays. The sheer volume of data being hashed and verified caused CPU spikes on the receiving gateways.

Strategic Analysis

This is arguably the most important technical story of the winter. Financial markets run on milliseconds. High-Frequency Trading (HFT) and Real-Time Gross Settlement (RTGS) systems are optimized for speed. The BIS finding that PQC introduces “meaningful” latency transforms PQC from a simple security upgrade into a performance degradation event.   

I have written extensively in the past about the cryptographic stack in payments and about the infrastructure challenges of PQC, so this finding is directly relevant to my work.

The Hardware Acceleration Necessity: This report suggests that the financial sector may need to invest in hardware acceleration sooner than expected. Software-based PQC implementations on general-purpose CPUs, as tested in Leap Phase 2, may not be fast enough for the core of the global financial system. We predict a surge in demand for specialized FPGA or ASIC offload cards designed to handle ML-DSA verification at line speed, taking the load off the CPU. Without this, the G7’s goal of a financial system that is both quantum-secure and efficient may be out of reach.

The Hybrid Trap: The report also exposed the “Hybrid Trap.” While regulators love the idea of hybrid (using both RSA and PQC) for safety, the engineering reality is brutal. It doubles the computation and massively increases the data payload. Project Leap found that TARGET2’s existing design could not “easily accommodate” this model without substantial redevelopment. This serves as a warning to other industries: Hybrid is safer, but it is also much, much heavier.
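The payload side of the “Hybrid Trap” can be sketched the same way: a hybrid-signed message carries both a classical and a PQC signature (and, in the worst case, both public keys). The signature and key sizes are the published figures used earlier; the message body size is an assumption for illustration:

```python
# Sketch of the hybrid payload penalty. Cryptographic sizes are
# published figures (PKCS#1 / FIPS 204); the body size is assumed.

RSA_SIG, RSA_PUB = 256, 256          # RSA-2048
ML_DSA_SIG, ML_DSA_PUB = 2_420, 1_312  # ML-DSA-44

body = 1_000  # assumed payment-message body, bytes

classical_msg = body + RSA_SIG + RSA_PUB
hybrid_msg = classical_msg + ML_DSA_SIG + ML_DSA_PUB  # carries both

print(f"classical message: {classical_msg} B")
print(f"hybrid message:    {hybrid_msg} B ({hybrid_msg / classical_msg:.1f}x)")
```

Under these assumptions the hybrid message is roughly 3.5x the classical one — safer, but much heavier, which is precisely the trade-off the report flags.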

Quantum Upside & Quantum Risk - Handled

My company - Applied Quantum - helps governments, enterprises, and investors prepare for both the upside and the risk of quantum technologies. We deliver concise board and investor briefings; demystify quantum computing, sensing, and communications; craft national and corporate strategies to capture advantage; and turn plans into delivery. We help you mitigate the quantum risk by executing crypto‑inventory, crypto‑agility implementation, PQC migration, and broader defenses against the quantum threat. We run vendor due diligence, proof‑of‑value pilots, standards and policy alignment, workforce training, and procurement support, then oversee implementation across your organization. Contact me if you want help.


Marin

I am the Founder of Applied Quantum (AppliedQuantum.com), a research-driven consulting firm empowering organizations to seize quantum opportunities and proactively defend against quantum threats. A former quantum entrepreneur, I’ve previously served as a Fortune Global 500 CISO, CTO, Big 4 partner, and leader at Accenture and IBM. Throughout my career, I’ve specialized in managing emerging tech risks, building and leading innovation labs focused on quantum security, AI security, and cyber-kinetic risks for global corporations, governments, and defense agencies. I regularly share insights on quantum technologies and emerging-tech cybersecurity at PostQuantum.com.