All Post-Quantum, PQC Posts
Post-Quantum, PQC, Quantum Security
Introduction to Quantum Random Number Generation (QRNG)
Cryptographic systems rely on unpredictable numbers to secure data: the strength of an encryption key is only as good as the randomness behind it. Truly random numbers - those that remain secure even against extensive computational resources and are completely unknown to adversaries - are among the most essential elements in cryptography and cybersecurity.
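As an illustrative aside (mine, not from the post): application code normally draws key material from a cryptographically secure pseudorandom generator seeded by an OS entropy pool; a QRNG is one hardware source that can feed such a pool. A minimal sketch using Python's standard `secrets` module:

```python
# Sketch: drawing unpredictable key material from the OS entropy pool.
# A QRNG would ultimately feed such a pool; here we use the stdlib CSPRNG.
import secrets

key = secrets.token_bytes(32)   # 256-bit symmetric key
nonce = secrets.token_hex(12)   # 96-bit nonce, hex-encoded

assert len(key) == 32
assert len(nonce) == 24         # 12 bytes -> 24 hex characters
print(f"key strength: {len(key) * 8} bits")
```

The point is the interface, not the source: whether the underlying entropy comes from thermal noise or quantum measurement, cryptographic code consumes it through an API like this.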
Sign Today, Forge Tomorrow (STFT) or Trust Now, Forge Later (TNFL) Risk
Sign Today, Forge Tomorrow (STFT) or Trust Now, Forge Later (TNFL) is the digital‑signature equivalent of Harvest Now, Decrypt Later (HNDL). Digital signatures underpin everything from software updates and firmware integrity to identity verification and supply‑chain provenance. Today’s signatures are based on RSA or ECDSA, which quantum computers will also break. When that happens, adversaries won’t just read secrets - they will forge signatures at will. The term Sign-Today-Forge-Tomorrow…
The CRQC Quantum Capability Framework
This guide is a detailed, end‑to‑end map for understanding what it will actually take to reach a cryptographically relevant quantum computer (CRQC) - one capable of breaking RSA-2048 - not just headline qubit counts. A CRQC must meet two conditions: the algorithmic requirements of the target attack and the hardware capabilities needed to execute it fault-tolerantly. The CRQC Quantum Capability Framework organizes these hardware capabilities into nine interdependent…
The Challenge of IT and OT Asset Discovery
Every CISO understands the simple truth: you can’t protect what you don’t know you have. A comprehensive inventory of IT and OT assets - from servers and laptops to industrial controllers and IoT sensors - is the foundation of effective cybersecurity. In theory, building this asset inventory sounds straightforward. In practice, it’s one of the hardest tasks in cybersecurity today. Many enterprises find that even…
Brassard–Høyer–Tapp (BHT) Quantum Collision Algorithm and Post-Quantum Security
The Brassard–Høyer–Tapp (BHT) algorithm is a quantum algorithm discovered in 1997 that finds collisions in hash functions faster than classical methods. In cryptography, a collision means finding two different inputs that produce the same hash output, undermining the hash’s collision resistance. The BHT algorithm theoretically reduces the time complexity of finding collisions from the classical birthday-paradox bound of about O(2^(n/2)) (for an n-bit hash) down…
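To make the classical birthday bound concrete, here is a toy sketch (mine, not from the post) of a collision search against a hash deliberately truncated to n = 24 bits; on average it succeeds after roughly 2^(n/2) = 4096 hash evaluations:

```python
# Toy classical birthday attack on a 24-bit truncation of SHA-256.
# Expected work is on the order of 2^(n/2) = 2^12 hash calls for n = 24.
import hashlib

def h24(data: bytes) -> bytes:
    """SHA-256 truncated to 24 bits (3 bytes) - deliberately weak."""
    return hashlib.sha256(data).digest()[:3]

def find_collision():
    seen = {}
    i = 0
    while True:
        msg = i.to_bytes(8, "big")
        tag = h24(msg)
        if tag in seen:
            return seen[tag], msg, i + 1  # two distinct inputs, query count
        seen[tag] = msg
        i += 1

m1, m2, queries = find_collision()
assert m1 != m2 and h24(m1) == h24(m2)
print(f"collision after {queries} queries (vs. 2^(n/2) = 4096 expected)")
```

A real hash with n = 256 pushes this bound to 2^128 evaluations, which is why any quantum speedup on collision finding - BHT's claim - matters for post-quantum parameter choices.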
Capability D.3: Continuous Operation (Long-Duration Stability)
One of the most critical requirements for a cryptographically relevant quantum computer (CRQC) is continuous operation - the ability to run a complex quantum algorithm non-stop for an extended period (on the order of days) without losing quantum coherence or needing a reset. In practical terms, the entire quantum computing stack - qubits, control electronics, error-correction processes, cooling systems - must sustain stable performance for…
Capability D.1: Full Fault-Tolerant Algorithm Integration
Imagine a quantum computer that can execute an entire algorithm start-to-finish with errors actively corrected throughout. Full fault-tolerant algorithm integration is exactly that: the orchestration of all components - stable logical qubits, high-fidelity gates, error-correction cycles, ancilla factories, measurements, and real-time feedback - to run a useful quantum algorithm reliably from beginning to end. This capability is essentially the “system integration” of quantum computing, bringing…
Capability D.2: Decoder Performance (Real‑Time Error Correction Processing)
In a fault-tolerant quantum computer, qubits are continuously monitored via stabilizer measurements (producing “syndrome” bits) to detect errors. The decoder is a classical algorithm (running on specialized hardware) that takes this rapid stream of syndrome data and figures out which qubits have experienced errors, so that corrections can be applied immediately. Achieving real-time decoding at scale is enormously challenging: a CRQC may need to handle…
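As a toy illustration (mine, not the post's) of what "syndrome bits in, corrections out" means, the smallest example is the classical 3-bit repetition code: two parity checks locate any single bit-flip, and the decoder is just a lookup table from syndrome to correction. Real surface-code decoders solve a vastly harder matching problem at microsecond rates, but the shape of the task is the same:

```python
# Minimal syndrome decoder: 3-bit repetition code (bit-flip errors only).
# Parity checks s1 = q0 XOR q1 and s2 = q1 XOR q2 locate a single flip.
SYNDROME_TO_CORRECTION = {
    (0, 0): None,  # no error detected
    (1, 0): 0,     # flip on bit 0
    (1, 1): 1,     # flip on bit 1
    (0, 1): 2,     # flip on bit 2
}

def measure_syndrome(bits):
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def decode(bits):
    """Apply the correction indicated by the syndrome, in place."""
    err = SYNDROME_TO_CORRECTION[measure_syndrome(bits)]
    if err is not None:
        bits[err] ^= 1
    return bits

assert decode([0, 1, 0]) == [0, 0, 0]  # single flip on bit 1 corrected
assert decode([1, 1, 1]) == [1, 1, 1]  # valid codeword left untouched
```

The scaling problem the post describes is that a CRQC replaces this four-entry table with a graph-matching computation over thousands of stabilizers, repeated every error-correction cycle without falling behind the syndrome stream.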