As I write this, the 2024 Paris Summer Olympic Games have just concluded. Being an avid runner, I was of course keenly interested in the marathon, the 10,000m, the 5,000m and the many other track events. There was no shortage of drama and thrills at all distances, and I applaud the strength, determination, patience and fierce competitive spirit displayed by all the athletes.
I wonder what goes through an Olympic competitor’s mind as they’re waiting at the start line. Time must stretch to a seeming eternity, and when the official recites that short phrase before firing the starting pistol every nerve, sinew and muscle is tensed to the point of explosion. Timing is critical—you want to burst off the blocks at the sound of the gun, but the embarrassment of a false start could lead to the ultimate failure of disqualification. As a middle-of-the-pack amateur, it’s a moment of stress and excitement that I can only imagine.
Early in 2023 I began writing about the looming threat that quantum computing poses to cryptography as we know it today, and I likened the path to Post-Quantum Cryptography (PQC) to a marathon. I was not alone in discussing this topic; many others have covered it before and after me—but much has changed in the intervening months, and as of August 13, 2024, I can say with confidence that the race is officially on. The course distance may not be as long as I once thought, and the starting pistol has been fired by none other than NIST, the National Institute of Standards and Technology. If we’re not feeling that Olympic level of start-line stress, we should be. My friends, it’s time to start running. The finish line of PQC is out there. In case you think you don’t need to run this race, let’s look at how we got here.
Before the start line
It all began 30 years ago, in 1994, when the American mathematician Peter Shor devised a quantum algorithm that could break down large integers into their prime factors. Prime factorization is notoriously hard to do on classical computers; the time needed to solve such problems grows faster than any polynomial in the size of the integer. Shor’s quantum approach reduced it to what’s known as a polynomial-time problem—so that the time to solution doesn’t grow nearly as fast. It’s interesting to note that Shor’s algorithm is in fact a hybrid: classical computation reduces factoring to the problem of finding the period of a related function, and only that period-finding step runs on a quantum computer.
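To make that hybrid split concrete, here is a minimal Python sketch of the classical scaffolding, with the quantum period-finding subroutine replaced by a classical brute-force stand-in (my own illustration, not production code):

```python
import random
from math import gcd

def find_order(a: int, n: int) -> int:
    """Find the smallest r > 0 with a^r = 1 (mod n).
    This is the step a quantum computer accelerates;
    here we simply brute-force it classically."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_skeleton(n: int) -> int:
    """Classical scaffolding of Shor's algorithm: pick a random
    base, find its order, and derive a factor of n from it."""
    while True:
        a = random.randrange(2, n)
        g = gcd(a, n)
        if g > 1:
            return g              # lucky guess: a already shares a factor
        r = find_order(a, n)
        if r % 2 == 1:
            continue              # need an even order; try another base
        y = pow(a, r // 2, n)
        if y == n - 1:
            continue              # trivial square root; try another base
        return gcd(y - 1, n)      # guaranteed non-trivial factor of n

print(shor_classical_skeleton(15))  # prints 3 or 5
print(shor_classical_skeleton(21))  # prints 3 or 7
```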
Prime factorization is of course the foundation of the cryptographic keys that secure online communications. RSA-2048, the most widely used public-key cryptosystem for internet traffic, is based on the difficulty (estimated at thousands of years or more for classical high-performance computers) of factoring integers 2,048 bits long, or as large as 2^2048—in other words, integers with 617 decimal digits. If Shor’s algorithm works and can accomplish this in minutes on a sufficiently large quantum computer, then we have a big problem.
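You can verify that digit count in one line of Python:

```python
# Quick arithmetic check on the size of a 2,048-bit modulus.
print(len(str(2 ** 2048)))  # 617 decimal digits
```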
Two questions come to mind. First, how big a quantum computer will be needed, and how long will it take for one to be built? And second, assuming that the answer to the first question is not unrealistic, what can we do to fix the problem?
The mathematics behind the mathematics
We can derive the answer to the first question from the mathematics underlying Shor’s algorithm, combined with the quantum technology roadmap published by IBM. I won’t reproduce all of it—involving how the large integer gets broken down into smaller parts for factorization—you can find it all on Wikipedia if you want. But the key piece I gleaned from it is this: for Shor’s algorithm to factor an integer N, you also need to find another integer at least as large as N which is a power of 2—expressed as 2^n for some value of n that you will have to find. Shor’s algorithm uses two quantum registers (groups of qubits) where one does the calculation, and the other governs the accuracy of the results. (Remember that quantum computing is inherently probabilistic.) The calculating register will require n qubits, and the governing register will require 2n+1 qubits. For RSA-2048 encryption, we’re given the value of n—it’s 2,048. So, we’ll need a quantum computer with two registers—one of at least 4,097 qubits and the other of 2,048 qubits—that can all function coherently without being affected by noise.
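Turning that counting rule into a trivial helper makes the arithmetic easy to check. This is a sketch of the rule above only; it ignores the far larger qubit overhead that quantum error correction adds in practice:

```python
# Back-of-the-envelope qubit counts for Shor's algorithm, using the
# register sizes described above (n work qubits, 2n + 1 governing qubits).
def shor_qubit_estimate(n: int) -> dict[str, int]:
    return {
        "calculating_register": n,        # performs the modular arithmetic
        "governing_register": 2 * n + 1,  # controls accuracy of the result
        "total": 3 * n + 1,
    }

print(shor_qubit_estimate(2048))
# {'calculating_register': 2048, 'governing_register': 4097, 'total': 6145}
```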
How long will it take to get such a cryptographically relevant quantum computer? Well, in 2016 NIST estimated it could happen by about the year 2030—and IBM’s current quantum roadmap as of December 2023 indicates that multiple thousands of logical qubits will be feasible sometime shortly after the year 2033. But beware—IBM and other technology companies are making great strides in developing logical qubits and scaling their quantum hardware, so I would expect to see this timeframe get shorter, not longer.
Assuming that we have less than a decade to defend ourselves against a quantum attack on cryptography, let’s look at the second question—what will it take for cryptography to be quantum-resistant? The answer to this one lies in NIST’s announcement from August 13, and the many years of work that led up to this point.
The new mathematics
After the publication of Shor’s algorithm, its impact on cryptography remained mostly an academic question for several years. Universities tinkered with small quantum computers of only a handful of qubits and were able to successfully factor integers such as 15 (3x5) or 21 (3x7) about half the time or a little more.[1] By 2016, as quantum computing began improving quickly, NIST started to take the problem seriously and invited businesses, universities, and anyone else who was interested to submit new algorithms that could be tested for their resilience against quantum attacks. Dozens of algorithms were submitted and several rounds of selection happened over the ensuing years, resulting in four winners being announced in 2022. Of these, one (FALCON) was deferred to a later standard, and on August 13, 2024, NIST published the three algorithms that form its PQC standard.
These algorithms carry the science-fiction-sounding names of CRYSTALS-Dilithium[2], CRYSTALS-KYBER and SPHINCS+. Together they cover the two forms of public-key cryptography most exposed to Shor’s algorithm: CRYSTALS-KYBER (standardized as ML-KEM) handles key encapsulation, the establishment of the keys that protect data in transit, while CRYSTALS-Dilithium (ML-DSA) and SPHINCS+ (SLH-DSA) handle digital signatures. The first two were developed by teams that included IBM researchers, and IBM is quick to point out that an author of the third has subsequently joined its quantum team.
What makes the three algorithms resistant to quantum attacks is that they no longer rely on prime factorization. The CRYSTALS algorithms are based on rather obscure constructions from pure mathematics known as lattices. To simplify a bit, think of a lattice as the endless grid of points you can reach by adding up whole-number multiples of a set of basis vectors. A lattice can start as a one-dimensional line or two-dimensional grid but can quickly grow into as many dimensions as you care to imagine—with the difficulty of its underlying problems growing exponentially as a result. There is no equivalent to Shor’s algorithm for solving problems on lattices[3] so even quantum computers are expected to take thousands of years to break lattice-based encryption.
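Here is the idea at its smallest scale: a toy two-dimensional lattice in Python, with basis vectors picked arbitrarily for illustration. Real lattice cryptography works in hundreds of dimensions, where finding a short or nearby lattice point becomes intractable:

```python
import itertools

# A toy 2-D lattice: every point reachable by whole-number
# combinations of two basis vectors (arbitrary example values).
b1, b2 = (3, 1), (1, 2)

points = sorted((x * b1[0] + y * b2[0], x * b1[1] + y * b2[1])
                for x, y in itertools.product(range(-2, 3), repeat=2))
print(points)  # 25 lattice points surrounding the origin
```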
The SPHINCS+ algorithm is based on a different mathematical tool known as a hash function, whose security also grows exponentially with the size of its output. Hash functions have been around mathematics and computer science for a long time already. They have the remarkable feature of being able to take data inputs of any size and convert them all to scrambled outputs (digests) that are always of equal length. They’re most useful for verifying the authenticity of the original—any slight change to the original, when hashed, yields a totally different output. Most important for quantum resilience is that hash functions are one-way—they’re easy to compute but virtually impossible to invert. Given a hash output, even a cryptographically relevant quantum computer can’t work backwards to recreate the original data.
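A few lines of Python show both properties using SHA-256, one of the hash functions SPHINCS+ can be instantiated with:

```python
import hashlib

# Two nearly identical inputs produce fixed-length, wildly different
# digests, and neither digest can be run backwards to recover its input.
for message in (b"post-quantum cryptography", b"post-quantum cryptographY"):
    print(message, "->", hashlib.sha256(message).hexdigest())
```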
Finish strong
Some companies have been confident enough in these algorithms’ resilience against quantum attacks that they began commercializing them even before NIST’s August 13, 2024 announcement. IBM embedded the CRYSTALS algorithms into its z16 family of mainframes at launch in 2022. Apple announced that its iMessage communications protocol would be quantum-resistant in early 2024. Security hardware vendors like Crypto4A and Thales had already built the CRYSTALS algorithms into their HSMs (hardware security modules) a year or more before the standard was final.
For most, the burning question should be where to start. Many of us have long been promoting the idea of a quantum risk assessment. The Canadian company EvolutionQ has been offering these for several years; IBM and many other systems integrators have similar offerings. These assessments could have been started and even completed before now, because they entail taking an inventory of all cryptographic routines embedded in your software and networks, creating a cryptographic bill of materials (CBOM) and doing a risk assessment of your data and vulnerabilities. Another Canadian company, Infosec Global, along with IBM and others, also offers software tools that can automatically scan your code, identify cryptographic vulnerabilities and even begin to remediate them. Implementing PQC itself, of course, can now proceed in earnest following NIST’s announcement.
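To give a feel for what a CBOM captures, here is a hypothetical record such a scan might produce. Every field name below is my own illustration, not taken from any particular tool or standard:

```python
# A hypothetical CBOM record: the kind of entry a cryptographic
# inventory tool might emit for each algorithm it discovers.
cbom_entry = {
    "component": "payments-api",            # hypothetical application name
    "location": "src/tls_config.c:142",     # where the algorithm was found
    "algorithm": "RSA-2048",
    "purpose": "key exchange",
    "quantum_vulnerable": True,
    "suggested_replacement": "ML-KEM-768",  # standardized form of KYBER
}
print(cbom_entry)
```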
Ultimately you want to achieve crypto agility. If you can extract the cryptography from your source code and isolate it as separate callable modules, it will be much easier to replace it with quantum-resistant algorithms—and to keep it continuously updated as the industry evolves. On the other hand, you may be using mostly off-the-shelf software from commercial vendors, or custom-built code written by other service providers. You will have to work with those vendors to ensure that they have realistic plans in place to achieve and maintain quantum resilience for you and all their other clients.
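In code, crypto agility often comes down to a simple interface boundary. Here is one way that separation might look in Python; the interface and its method names are hypothetical, sketched only to show the pattern:

```python
from typing import Protocol

class KeyExchange(Protocol):
    """Abstract key-establishment interface (hypothetical names).

    Application code depends only on this interface, so swapping an
    RSA-based implementation for a PQC one never touches the callers.
    """
    def generate_keypair(self) -> tuple[bytes, bytes]: ...
    def encapsulate(self, public_key: bytes) -> tuple[bytes, bytes]: ...

def establish_shared_secret(kex: KeyExchange) -> bytes:
    public_key, _private_key = kex.generate_keypair()
    _ciphertext, shared_secret = kex.encapsulate(public_key)
    return shared_secret  # callers never name RSA or ML-KEM directly
```

With a boundary like this in place, migrating to a quantum-resistant algorithm means writing one new implementation class rather than auditing every call site.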
The US government has already passed legislation mandating that federal agencies migrate to PQC. Canadian and European governments have issued guidelines, but I am sure that legislation will follow there too. The threat is real, but the solution is now available and the process is known.
Go!
[1] This may seem laughably trivial, but one of the interesting features of Shor’s algorithm is that it gets better with more prime factors. Mathematically, the probability of success is expressed as 1 − (1/2^k), where k is the number of prime factors you’re looking for. So for such very small integers, a proof published on ScienceDirect suggests that the probability is at least 50% and, if I understand it correctly, it should converge to 75%. But for very large integers that also have many prime factors, so that the value of k is high, the probability of success quickly approaches 100%.
[2] Star Trek fans will recognize “dilithium crystals” as an extremely rare substance critical for the regulation of matter-antimatter reactions in a starship’s warp drive. The crystals form a lattice structure in more than four dimensions, thus possibly resembling the mathematical lattices underlying the new PQC algorithms. Never underestimate the charming nerdiness of mathematicians.
[3] At least, not yet. One of my colleagues adamantly opposes the term quantum-safe on the reasonable grounds that we just don’t know if a new quantum algorithm might be devised in the future which could break the CRYSTALS or SPHINCS+ implementations. I would tend to agree, and I think PQC is a solid intermediate but necessary step toward the ultimate goal of a whole new cryptography based on quantum communication principles.