Post-Quantum Cryptography

A complete guide to the cryptographic algorithms being built to withstand quantum computers — and why the transition has already begun.

What Is Post-Quantum Cryptography?

Post-quantum cryptography (PQC) refers to cryptographic algorithms designed to be secure against attacks by both classical and quantum computers. Unlike the RSA and elliptic curve cryptography (ECC) that currently protect most of the internet, PQC algorithms are built on mathematical problems that quantum computers cannot efficiently solve.

The term “post-quantum” does not mean these algorithms require quantum hardware to run. They are conventional algorithms — executable on today’s classical computers — whose security properties are expected to hold even when large-scale quantum computers become available. This distinction matters: PQC is not quantum cryptography (which uses quantum mechanical properties like entanglement for key distribution), but rather classical cryptography hardened against a quantum adversary.

The urgency behind PQC stems from Shor’s algorithm, published by mathematician Peter Shor in 1994. Shor’s algorithm demonstrates that a sufficiently powerful quantum computer can factor large integers and compute discrete logarithms in polynomial time — operations that underpin RSA, Diffie-Hellman, and ECC. When (not if) such a machine is built, every system relying on these algorithms becomes vulnerable. The cryptographic community’s response is PQC: a new generation of algorithms grounded in different hard problems.

Why It Matters Now

A common misconception holds that PQC can wait until quantum computers are close to breaking current encryption. This overlooks two critical realities.

First, the “harvest now, decrypt later” threat is already active. Nation-state adversaries are collecting encrypted communications today with the intent of decrypting them once quantum computers become capable. Sensitive data with long confidentiality requirements — diplomatic communications, health records, financial data, intellectual property, classified intelligence — is at risk right now. The U.S. National Security Agency (NSA) has explicitly warned about this threat.

Second, cryptographic migrations are slow. The transition from SHA-1 to SHA-256 took over a decade. Moving from DES to AES required years of standardization, implementation, testing, and deployment. The PQC transition is larger in scope because it affects both encryption and digital signatures — the two pillars of internet security. Every TLS connection, every signed software update, every authenticated API call must eventually migrate.

The combination of an active “harvest now” threat and a multi-year migration timeline means the window for action is now. Organizations that wait for a cryptographically relevant quantum computer (CRQC) to appear before beginning migration will be years too late.

NIST Post-Quantum Standards

The U.S. National Institute of Standards and Technology (NIST) began its Post-Quantum Cryptography Standardization Process in 2016, receiving 82 submissions (69 of which met the minimum acceptance criteria) and winnowing them across multiple evaluation rounds. In August 2024, NIST published its first three finalized PQC standards:

ML-KEM (FIPS 203) — Key Encapsulation

ML-KEM (Module-Lattice-Based Key-Encapsulation Mechanism), derived from the CRYSTALS-Kyber submission, is NIST’s primary standard for key exchange. It replaces the Diffie-Hellman and ECDH key agreement protocols used in TLS and other transport security. ML-KEM’s security rests on the hardness of the Module Learning With Errors (MLWE) problem, a structured variant of a lattice problem with no known quantum speedup. It offers three parameter sets: ML-KEM-512, ML-KEM-768, and ML-KEM-1024, corresponding to approximately 128, 192, and 256 bits of classical security (NIST security categories 1, 3, and 5).
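
The key-encapsulation flow can be sketched with a toy stand-in. The code below is not ML-KEM and has no security whatsoever (the "encryption" is a plain XOR); it only illustrates the keygen/encaps/decaps call pattern that KEM libraries expose, and all names in it are illustrative:

```python
# Toy sketch of the KEM interface: keygen / encaps / decaps.
# NOT ML-KEM -- the hard problem here is trivial. It only shows
# the three-call pattern a real KEM implementation exposes.
import hashlib
import os

def keygen():
    sk = os.urandom(32)                        # secret key
    pk = hashlib.sha256(b"pk" + sk).digest()   # stand-in public key
    return pk, sk

def encaps(pk):
    m = os.urandom(32)                         # random message
    ct = bytes(a ^ b for a, b in zip(m, pk))   # toy "encryption" under pk
    ss = hashlib.sha256(m).digest()            # derived shared secret
    return ct, ss

def decaps(sk, ct):
    pk = hashlib.sha256(b"pk" + sk).digest()   # recompute pk from sk
    m = bytes(a ^ b for a, b in zip(ct, pk))   # recover the message
    return hashlib.sha256(m).digest()

pk, sk = keygen()
ct, ss_sender = encaps(pk)       # sender derives a secret + ciphertext
ss_receiver = decaps(sk, ct)     # receiver derives the same secret
assert ss_sender == ss_receiver
```

In a TLS handshake the client runs keygen, the server runs encaps against the client's public key, and both sides end up with the same shared secret; only the ciphertext crosses the wire.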

ML-DSA (FIPS 204) — Digital Signatures

ML-DSA (Module-Lattice-Based Digital Signature Algorithm), derived from CRYSTALS-Dilithium, is the primary standard for digital signatures. It replaces RSA and ECDSA signatures in code signing, certificate authorities, document authentication, and blockchain protocols. Like ML-KEM, it is built on module lattice problems. ML-DSA provides three parameter sets (ML-DSA-44, ML-DSA-65, ML-DSA-87) at increasing security levels.

SLH-DSA (FIPS 205) — Stateless Hash-Based Signatures

SLH-DSA (Stateless Hash-Based Digital Signature Algorithm), derived from SPHINCS+, provides a conservative alternative to ML-DSA. Its security rests solely on the properties of hash functions — arguably the most studied and trusted primitive in cryptography. SLH-DSA signatures are larger and slower than ML-DSA, but it serves as a hedge: if a breakthrough were to weaken lattice-based assumptions, SLH-DSA would remain secure. NIST recommends it particularly for applications where long-term signature verification is critical.

Additional Candidates Under Evaluation

NIST has not stopped at three algorithms. Additional signature schemes, including FALCON (now FN-DSA, based on NTRU lattices), are still being standardized. NIST has also solicited new signature proposals to diversify the portfolio, particularly schemes grounded in coding theory, isogenies, and multivariate polynomials. The goal is a broad toolkit so that a break in one mathematical family does not compromise the entire standard.

Lattice-Based Cryptography

Lattice-based cryptography is the dominant family in PQC, underpinning both ML-KEM and ML-DSA. A lattice, in this context, is the set of all integer combinations of a set of basis vectors: a regular grid of points in high-dimensional space. The fundamental hard problems are:

  • Shortest Vector Problem (SVP): Given a lattice, find the shortest non-zero vector. In high dimensions, this is believed to be intractable for both classical and quantum computers.
  • Learning With Errors (LWE): Given a system of approximate linear equations over a finite field, recover the secret. The “errors” (small noise terms) make the problem hard even for quantum algorithms.
  • Module-LWE and Ring-LWE: Structured variants of LWE that enable more efficient constructions with smaller key sizes, used in the NIST standards.
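
A toy LWE instance makes the structure above concrete. The parameters below are far too small to be secure (real schemes use moduli and dimensions orders of magnitude larger); they only show what "approximate linear equations" means:

```python
# Toy Learning With Errors instance: b = A*s + e (mod q).
# Parameters are deliberately tiny and insecure.
import random

q, n, m = 97, 4, 8   # modulus, secret length, number of samples
s = [random.randrange(q) for _ in range(n)]                       # secret
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]   # public matrix
e = [random.choice([-1, 0, 1]) for _ in range(m)]                 # small noise
b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]

# Without e, recovering s from (A, b) is plain Gaussian elimination.
# With noise, each equation holds only approximately, and the best
# known attacks (classical or quantum) scale badly with dimension.
residual = [(b[i] - sum(A[i][j] * s[j] for j in range(n))) % q for i in range(m)]
print(all(r in (0, 1, q - 1) for r in residual))   # residual is exactly the +/-1 noise
```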

The appeal of lattice problems for PQC is threefold. First, no quantum algorithm provides a significant advantage over classical algorithms for solving them — unlike factoring or discrete logarithms. Second, they support a rich variety of cryptographic constructions including encryption, signatures, and even fully homomorphic encryption. Third, lattice-based schemes tend to have good performance characteristics: ML-KEM key generation and encapsulation are faster than their RSA equivalents, though key and ciphertext sizes are larger.

The main concern is that lattice problems are less studied than factoring. The cryptographic community has analyzed RSA for nearly fifty years; lattice-based assumptions have received serious attention for roughly two decades. While no practical attacks have emerged, the shorter track record is why NIST has standardized SLH-DSA as a hash-based backup.

Hash-Based Signatures

Hash-based signature schemes derive their security entirely from the properties of cryptographic hash functions — specifically collision resistance and preimage resistance. Because Grover’s algorithm provides only a quadratic speedup for searching (not an exponential one like Shor’s for factoring), hash functions with sufficiently large output remain secure against quantum computers. Doubling the hash output size compensates for Grover’s advantage.
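
The arithmetic behind that sizing rule is simple enough to state directly (a sketch; real security estimates also account for Grover's practical overheads):

```python
# Grover's quadratic speedup halves the effective bit-security of an
# exhaustive preimage search: 2^n classical work -> ~2^(n/2) quantum work.
def quantum_preimage_bits(output_bits: int) -> int:
    return output_bits // 2

print(quantum_preimage_bits(256))  # SHA-256 retains ~128-bit quantum security
print(quantum_preimage_bits(512))  # SHA-512 retains ~256-bit quantum security
```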

The foundational construction is the Merkle signature scheme (1979), which uses a binary tree of one-time signature (OTS) keys. Each leaf is a one-time signing key; the root serves as the public key. To sign, the signer uses one OTS key and provides an authentication path (Merkle proof) from the leaf to the root.
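
The tree part of the construction can be sketched in a few lines. Plain hashes stand in for the one-time public keys, and the one-time signature itself is omitted; only the root computation and authentication-path verification are shown:

```python
# Minimal Merkle-tree sketch: the root is the public key, and a
# signature on leaf i carries the sibling hashes (authentication
# path) needed to recompute the root.
import hashlib

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

# Eight leaves, each standing in for a one-time public key.
leaves = [H(b"ots-public-key-%d" % i) for i in range(8)]

def merkle_root(nodes):
    while len(nodes) > 1:
        nodes = [H(nodes[i] + nodes[i + 1]) for i in range(0, len(nodes), 2)]
    return nodes[0]

def auth_path(nodes, index):
    path = []
    while len(nodes) > 1:
        path.append(nodes[index ^ 1])   # sibling at this level
        nodes = [H(nodes[i] + nodes[i + 1]) for i in range(0, len(nodes), 2)]
        index //= 2
    return path

def verify(leaf, index, path, root):
    node = leaf
    for sibling in path:
        node = H(node + sibling) if index % 2 == 0 else H(sibling + node)
        index //= 2
    return node == root

root = merkle_root(leaves)           # published once as the public key
path = auth_path(leaves, 5)          # shipped with a signature from leaf 5
assert verify(leaves[5], 5, path, root)
```

The verifier never sees the whole tree: log2(leaves) sibling hashes suffice to tie any leaf to the published root.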

SPHINCS+ (now SLH-DSA) extends this idea into a hypertree of Merkle trees, eliminating the statefulness problem that makes classical Merkle signatures impractical for general use. In a stateful scheme, the signer must track which OTS keys have been used — reusing a key is catastrophic. SLH-DSA avoids this by using a pseudo-random function to deterministically select a leaf, making the scheme stateless at the cost of larger signatures (typically 7–49 KB depending on the parameter set).

NIST has also approved, in SP 800-208, the stateful hash-based schemes XMSS (RFC 8391) and LMS (RFC 8554) for specific applications like firmware signing, where state management is feasible and signature compactness matters. These produce signatures of around 2.5 KB but require careful key management infrastructure.

Code-Based Cryptography

Code-based cryptography builds on the difficulty of decoding random linear codes — a problem that has resisted algorithmic advances for over forty years. The McEliece cryptosystem, proposed in 1978, is the oldest PQC candidate still considered secure. It was submitted to NIST’s PQC process as Classic McEliece and advanced to the fourth round of evaluation.

The core idea: the public key is a “scrambled” version of an error-correcting code (specifically, a binary Goppa code). Encryption adds random errors to a codeword; only the holder of the private key (the unscrambled code) can correct the errors and recover the message. The security assumption is that distinguishing a Goppa code from a random code, and decoding a general linear code, are both computationally hard problems.
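
The add-errors/correct-errors idea can be illustrated with a 3x repetition code. Real McEliece uses binary Goppa codes and hides the code's structure behind the public key; this toy does neither and is purely a sketch of the error-correction step:

```python
# Toy intuition for McEliece-style encryption, using a 3x repetition
# code instead of a Goppa code. Nothing here is hidden or secure --
# it only shows "encryption adds errors, decoding removes them".
import random

def encode(bits):                  # repeat each bit three times
    return [b for bit in bits for b in (bit, bit, bit)]

def add_errors(codeword, count):   # flip one bit in `count` random blocks
    cw = codeword[:]
    for block in random.sample(range(len(cw) // 3), count):
        cw[3 * block + random.randrange(3)] ^= 1
    return cw

def decode(codeword):              # majority vote per 3-bit block
    return [1 if sum(codeword[i:i + 3]) >= 2 else 0
            for i in range(0, len(codeword), 3)]

msg = [1, 0, 1, 1, 0, 0, 1, 0]
noisy = add_errors(encode(msg), count=3)   # "ciphertext" with errors
assert decode(noisy) == msg                # legitimate decoder corrects them
```

In real McEliece the public, scrambled code gives an attacker no efficient decoder, while the private description of the Goppa code does; the repetition code above is trivially decodable by anyone, which is exactly what the scrambling prevents.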

The primary drawback of Classic McEliece is key size. Public keys range from roughly 260 KB to over 1 MB depending on the security level — orders of magnitude larger than lattice-based schemes. This makes it impractical for TLS handshakes and most interactive protocols, but potentially suitable for applications where keys can be pre-distributed or stored in advance.

Code-based cryptography’s strength is its maturity. The underlying problems have been studied since the 1960s, and the McEliece system has withstood over four decades of cryptanalysis — a longer track record than any lattice-based scheme. This makes it a valuable diversification strategy for the post-quantum toolkit.

Timeline & Migration

The PQC migration is already underway. Key milestones:

  • 2022–2023: NSA publishes CNSA 2.0, setting PQC adoption timelines for national security systems. Signal deploys PQXDH for post-quantum key agreement.
  • 2024: NIST publishes FIPS 203, 204, and 205 — the first finalized PQC standards. Google Chrome and Cloudflare deploy ML-KEM in hybrid mode for TLS. Apple integrates PQ3 (a hybrid post-quantum protocol) into iMessage.
  • 2025–2030: NIST’s recommended migration window. Organizations should inventory cryptographic dependencies, prioritize high-value assets, and begin deploying hybrid (classical + PQC) solutions.
  • 2030–2035: Expected deprecation of classical-only cryptography for government and critical infrastructure. Full PQC deployment targets for most sectors.

The recommended migration strategy is hybrid deployment: running both a classical algorithm (e.g., ECDH) and a post-quantum algorithm (e.g., ML-KEM) simultaneously. This ensures that security is maintained even if the PQC algorithm turns out to have an unforeseen weakness, while also protecting against quantum attacks that would break the classical component alone.
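
The combination step can be sketched as follows, with random bytes standing in for the two real shared secrets. Real protocols derive the session key with HKDF and carefully chosen labels (as in the TLS hybrid key-exchange drafts); the single-hash KDF here is a simplification:

```python
# Hybrid key agreement sketch: derive the session key from BOTH the
# classical and post-quantum shared secrets, so an attacker must
# break both components to recover it.
import hashlib
import os

ecdh_secret = os.urandom(32)    # stand-in for an X25519/ECDH shared secret
mlkem_secret = os.urandom(32)   # stand-in for an ML-KEM shared secret

def combine(classical: bytes, post_quantum: bytes, context: bytes) -> bytes:
    # Simplified concatenation KDF; production protocols use HKDF
    # with protocol-specific labels and transcript binding.
    return hashlib.sha256(classical + post_quantum + context).digest()

session_key = combine(ecdh_secret, mlkem_secret, b"hybrid-handshake-v1")
assert len(session_key) == 32
```

Because the session key depends on both inputs, a future quantum break of ECDH or an unforeseen flaw in ML-KEM alone leaves the derived key unpredictable to the attacker.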

The practical challenge is scale. Every library, every protocol implementation, every hardware security module, and every certificate authority must be updated. Organizations should begin with a cryptographic inventory — cataloging where and how they use public-key cryptography — and then prioritize migration based on data sensitivity and system exposure.

Further Reading

  • NIST Post-Quantum Cryptography Standardization — csrc.nist.gov/projects/post-quantum-cryptography
  • FIPS 203, 204, 205 — The finalized NIST PQC standards documents
  • “Post-Quantum Cryptography: Current State and Quantum Mitigation” — CISA guidance for critical infrastructure operators
  • NSA Cybersecurity Advisory: Commercial National Security Algorithm Suite 2.0 (CNSA 2.0) — Migration timelines for national security systems
  • Daniel J. Bernstein and Tanja Lange, “Post-quantum cryptography” — Foundational academic survey
