The existing encryption standards that underpin just about every online exchange of information are bits of gnarly mathematics designed to be well-nigh impossible for today’s computers to crack without just the right arithmetical key. But NIST’s scientists have not been pondering today’s machines. They worry about a coming era of quantum computers.
These exploit the weirdness of the quantum world to perform calculations in fundamentally different ways from those used by conventional computers. This confers an enormous theoretical advantage in a small number of problem types—including identifying a large number’s prime factors (numbers, divisible only by themselves and one, that can be multiplied together to obtain the number in question) and computing the properties of points on functions called elliptic curves.
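To see the asymmetry at the heart of factorisation, consider the toy sketch below (our illustration, not the article’s): multiplying two primes takes a single step, while recovering them from the product by brute-force trial division takes around 100,000 steps even for these modest numbers, and the cost grows exponentially with the number’s bit-length.

```python
# Toy illustration: multiplying primes is cheap; undoing it is not.
def trial_division(n: int) -> int:
    """Return the smallest prime factor of n by brute force."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f
        f += 1
    return n  # n itself is prime

p, q = 99_989, 99_991      # two modest primes
n = p * q                  # the easy direction: one multiplication
print(trial_division(n))   # the hard direction: ~100,000 trial divisions here;
                           # a 2,048-bit n is utterly out of classical reach
```

It is this exponential gap between making a product and unmaking it that quantum machines threaten to close.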
Both are used widely in cryptography. RSA, an algorithm based on factorisation, is employed alongside elliptic-curve cryptography in most internet connections, and in virtual private networks, messaging services including WhatsApp and Signal, and the anonymising web browser Tor. Yet both would crumble against a sufficiently advanced quantum computer running Shor’s algorithm, developed in 1994 by Peter Shor, an American mathematician. [..]
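To make the dependence concrete, here is textbook RSA with toy numbers (a sketch of our own; real RSA uses 2,048-bit moduli, random padding and much else). Everything an attacker needs, the private exponent d, follows immediately once n is factored into p and q, which is precisely the step Shor’s algorithm makes easy.

```python
# Textbook RSA with tiny numbers (illustrative only; requires Python 3.8+).
p, q = 61, 53                # the secret primes
n = p * q                    # public modulus: 3233
phi = (p - 1) * (q - 1)      # computable only by someone who knows p and q
e = 17                       # public exponent, chosen coprime to phi
d = pow(e, -1, phi)          # private exponent: the inverse of e modulo phi

m = 65                       # a message, encoded as a number below n
c = pow(m, e, n)             # anyone can encrypt with the public key (e, n)
assert pow(c, d, n) == m     # only the holder of d can decrypt
```

Factor n and you have phi, hence d, hence every message.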
A survey of experts, conducted in 2021, found that a majority believed RSA-2048, an industry-standard encryption scheme that uses keys 2,048 bits long, could be broken within 24 hours by 2036. [..]
Though little of today’s internet chatter is likely to interest a hacker from, say, 2040, plenty of data, such as medical records, national-security communications or technical details of long-lived infrastructure, might retain their value until then. And data sent around willy-nilly today, on an assumption of impregnability, need not be strategically relevant to hackers for their exposure to pose an embarrassment or a risk to the businesses or officials who were doing the sending. [..]
One path forward would be to use quantum-powered defences against a quantum-powered attack, deploying what is known as quantum-key distribution. That, though, requires expensive kit and dedicated connections. Governments and large companies might manage that, but smaller fry would find it hard.
A more promising approach would be to identify new classes of mathematical problems that even quantum machines would struggle to crack. This was NIST’s task. In 2016 it launched a competition to find worthy algorithms for “post-quantum cryptography” (PQC), receiving 82 submissions from 25 countries. After three rounds of sifting and valiant searches for vulnerabilities by independent cryptographers, four winning techniques and four backup approaches have emerged.
The winners were all developed by consortia of academic and commercial researchers and all, as you might expect, involve melon-twisting mathematics best left to the experts. One called Kyber, the brainchild of a group called CRYSTALS (Cryptographic Suite for Algebraic Lattices; the name refers to abstruse groupings in number theory), is for general encryption. The remaining three propose digital signatures, which will allow senders to verify their identity reliably. Two also use lattices: CRYSTALS-Dilithium and Falcon (Fast-Fourier Lattice-based Compact Signatures over NTRU; NTRU is an acronym that allegedly stands for Number Theorists “R” Us). [..]
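What “lattice-based” means in practice is easiest to glimpse in the “learning with errors” problem that underlies Kyber: linear equations are easy to solve, but the same equations with a little random noise added are thought to be hard to unscramble, even for quantum computers. The sketch below is a drastically simplified and insecure illustration of our own; the parameters are invented for readability, though 3,329 is Kyber’s actual modulus.

```python
import random

q, n, samples = 3329, 8, 16     # toy sizes; real schemes use far more structure

s = [random.randrange(q) for _ in range(n)]                # the secret vector
A = [[random.randrange(q) for _ in range(n)] for _ in range(samples)]
b = [(sum(a * x for a, x in zip(row, s)) + random.randrange(-2, 3)) % q
     for row in A]              # public key: noisy inner products with s

def encrypt(bit: int):
    subset = [i for i in range(samples) if random.random() < 0.5]
    u = [sum(A[i][j] for i in subset) % q for j in range(n)]
    v = (sum(b[i] for i in subset) + bit * (q // 2)) % q   # hide bit near 0 or q/2
    return u, v

def decrypt(u, v):
    noisy = (v - sum(a * x for a, x in zip(u, s))) % q     # small noise remains,
    return 1 if q // 4 < noisy < 3 * q // 4 else 0         # but the bit shows

u, v = encrypt(1)
assert decrypt(u, v) == 1
```

Without s, recovering the bit means untangling noisy linear algebra, which is believed to require finding short vectors in a high-dimensional lattice; no efficient quantum algorithm for that is known.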
Coding the winning algorithms into practical software is expected to take until 2024, according to Dustin Moody, who led the competition. Brian LaMacchia, head of the security and cryptography team at Microsoft Research, worries that people will not start the PQC transition early enough. The White House, though, issued a national-security memorandum in May, telling federal agencies to gird for a cryptographic transition and directing NIST to encourage the private sector to do likewise. [..]
Peter Schwabe, a cryptographer at Radboud University in the Netherlands who has been involved in developing both Kyber and SPHINCS+, therefore reckons the future lies in a hybrid approach. Rather than jettisoning existing encryption, he proposes retaining the algorithms currently in use and adding a post-quantum layer on top [..].
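In code, the hybrid idea is simply to derive one session key from two independently negotiated secrets, so that an attacker must break both schemes at once. The sketch below is our illustration of the shape of it, with stand-in byte strings where real elliptic-curve and Kyber exchanges would go.

```python
import hashlib

# Stand-ins for secrets negotiated by two separate key exchanges
# (in reality: an elliptic-curve Diffie-Hellman run and a Kyber run).
classical_secret = b"\x01" * 32
post_quantum_secret = b"\x02" * 32

# Concatenate both secrets and hash them into one session key: breaking
# either scheme alone leaves the attacker short of the other input.
session_key = hashlib.sha3_256(classical_secret + post_quantum_secret).digest()
```

If the post-quantum scheme were later found wanting, the classical layer still stands; if quantum computers arrive, the post-quantum layer holds.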