I used to think that my kids, both solidly of Generation Z, spoke a completely different version of the English language when they were teenagers. The word sick, for example, had nothing to do with disease but instead meant something good or beneficial. Nasty somehow got abbreviated to “narst” or “narse.” And, more to the point of this post, random morphed from meaning unexpected or unpredictable to a general adjective or adverb with a somewhat negative connotation. “Dude, that’s so random!” would indicate that the speaker disapproved of, disagreed with, or at least didn’t fully understand something. Of course, language evolves, and every generation bends it to its own purposes. But maybe the kids were partly right about randomness—it’s a complicated subject and much harder to understand and achieve than you might expect.
It turns out, quantum computing just might help us reclaim the correct definition of random—and fix a serious vulnerability along the way.
Sometimes the numbers run hot
Early in my career, in the late 1980s and early 1990s, I used to teach computer programming classes for IBM. The curriculum was rather specialized, building software applications for IBM’s new PC operating system known as OS/2. There were only a few of us at IBM doing this work, and OS/2 itself eventually gave way to Microsoft Windows, but I have good memories of travelling across North America and around the world, delivering classes and speaking at conferences.
One of the benefits of the job was meeting many people, from all walks of life, who shared an interest in technology and often a background in mathematics or science. Sticking in my mind is a conversation I had with one of my students who told me that when he wasn’t working as a programmer, he would often play the lottery for profit. He explained that “sometimes the numbers run hot” and he could detect a pattern in weekly winning tickets, using that to his advantage.
Of course, being a recent (and cocky) mathematics graduate, I took issue with this notion, pointing out what I thought was the obvious fact that lottery numbers are randomly generated and there couldn’t possibly be a detectable pattern. But he insisted that he had in fact been able to make money using his system, so we eventually ended the conversation with a friendly agreement to disagree.
Perhaps I should not have been so cocky, because had I paid more attention in some of my computer science classes, I would have remembered that computer-generated random numbers are not really random—they are pseudo-random and produced by an algorithm. Therefore, someone with enough time and dedication could possibly reverse-engineer the algorithm and predict the next number. True randomness, it turns out, is almost impossible to find and maybe—just maybe—my former student was on to something after all.
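To see why pseudo-random numbers are predictable, here is a toy sketch in Python of a linear congruential generator, the kind of simple algorithm found in textbooks (the constants are the familiar textbook ones). It is purely illustrative and not the generator any real lottery uses: anyone who knows the algorithm and the seed can reproduce the entire sequence.

```python
# Toy linear congruential generator (LCG) for illustration only.
# Real systems use far stronger generators, but any algorithm is
# ultimately deterministic.

class TinyLCG:
    def __init__(self, seed: int):
        self.state = seed

    def next(self) -> int:
        # The state advances deterministically: new = (a * old + c) mod m
        self.state = (1103515245 * self.state + 12345) % 2**31
        return self.state

lottery = TinyLCG(seed=20250614)
observed = [lottery.next() % 49 + 1 for _ in range(6)]   # "this week's draw"

# Anyone with the same algorithm and seed produces identical output.
attacker = TinyLCG(seed=20250614)
predicted = [attacker.next() % 49 + 1 for _ in range(6)]
assert predicted == observed
```

Modern generators are much harder to reverse-engineer, but the underlying point stands: if the process is an algorithm, it is deterministic.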
Random types
Random selection, and random numbers, are necessary in many fields and have two primary uses. The first, least strict use is simply to ensure a fair distribution when making selections. For example, if a quantitative analyst in banking is running a Monte Carlo simulation, it matters only that they have a broad range of inputs, not that the input selections could possibly be predicted. The same would go for a scientist selecting candidates for a clinical trial, or a statistician devising a survey. This is what statisticians would call creating a random distribution.
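As a concrete illustration of that first, looser use, here is a minimal Monte Carlo sketch in Python that estimates pi by throwing points at a unit square. It deliberately fixes the seed for reproducibility; for this kind of job the samples only need to be well spread, not unpredictable.

```python
import random

# Estimate pi by sampling points in the unit square and counting how many
# land inside the quarter circle of radius 1.
random.seed(42)   # reproducibility is fine here; unpredictability is not required
n = 1_000_000
inside = sum(random.random() ** 2 + random.random() ** 2 <= 1.0 for _ in range(n))
print("pi is roughly", 4 * inside / n)
```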
Other use cases have a stricter requirement: the random selection should be as unpredictable as possible, as in lotteries and other games of chance. But as my former student’s lottery system and the many card counters in casinos suggest, algorithmic randomness is never perfectly unpredictable, and so it remains vulnerable to anyone with enough computational power and analytical ability.
There’s a good Wikipedia article that describes in detail the many classical approaches to random number generation. Beyond software algorithms, scientists have turned to hardware and physics-based sources of randomness as well—anything from cosmic radiation and radio static to the movement of a hard drive’s read/write head or the timing of a human user’s keystrokes. Even with these, Wikipedia notes that there are asymmetries and measurement issues that affect the true randomness of the outcomes. Moreover, performance is an issue: applications need random numbers quickly and in volume, and physical entropy sources can be slow to deliver them. Hardware-based random number generation can be an improvement over software algorithms, but it’s still not perfect.
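To give a feel for those measurement issues, here is a crude and purely illustrative Python sketch that harvests bits from timing jitter and then applies the classic von Neumann trick to strip out bias. It is a demonstration of the idea, not a production entropy source, and how much genuine entropy it gathers depends heavily on the machine it runs on.

```python
import time

def raw_bits(n: int) -> list[int]:
    """Collect the least-significant bit of nanosecond timestamps.

    Timing jitter from the clock and the OS scheduler is a crude physical
    entropy source, but the raw bits are typically biased and correlated.
    """
    bits = []
    while len(bits) < n:
        bits.append(time.perf_counter_ns() & 1)
        time.sleep(0)          # yield to the scheduler to pick up jitter
    return bits

def von_neumann(bits: list[int]) -> list[int]:
    """Classic debiasing: map the pair 01 -> 0 and 10 -> 1, discard 00 and 11."""
    return [a for a, b in zip(bits[0::2], bits[1::2]) if a != b]

print(von_neumann(raw_bits(4096))[:32])
```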
Personally, I don’t play the lottery and have been to Las Vegas many times for conferences without ever hitting the tables, so I don’t care much if others cheat at gambling. Random numbers, however, are also at the heart of classical cryptographic algorithms that protect sensitive internet traffic and private or classified data. Generating a cryptographic key starts with a random number called a seed, which is fed into the cryptographic algorithm. Once the keys are established, and as long as the private key is not accidentally shared, the data and transactions are secure. Classical pseudo-random number generation, if the algorithm is complex enough, has managed pretty well so far in generating random enough keys.
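In Python terms, the distinction looks something like the sketch below: the general-purpose random module is a deterministic Mersenne Twister and explicitly unsuitable for keys, while the secrets module draws on the operating system's cryptographic generator, which mixes in hardware entropy. It is only meant to show where seed quality matters, not to serve as a key-generation recipe.

```python
import random
import secrets

# The random module is a Mersenne Twister: statistically excellent, but
# deterministic and reconstructible from enough output, so not for keys.
random.seed(123456)
weak_key = random.getrandbits(256).to_bytes(32, "big")

# The secrets module draws from the OS cryptographic generator, which mixes
# in hardware entropy. This is the layer a certified quantum source of
# randomness could eventually feed.
strong_key = secrets.token_bytes(32)

print("weak  :", weak_key.hex())
print("strong:", strong_key.hex())
```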
I suspect, though, that hackers have generally just followed easier methods like phishing e-mails or text messages to breach data systems. However, if a randomly generated seed were reverse-engineered by a hacker familiar with the generation algorithm and in possession of enough patience and computing power, it could lead to data vulnerability without an easy defence. It has also been alleged, though not proven, that several of the many breaches of cryptocurrency exchanges over the years have been due to insufficiently random key generation methods.
God does play dice
Fans of the television series Breaking Bad will recall that the protagonist, Walter White, adopted the pseudonym Heisenberg when he began dealing with drug cartels and other criminal groups. Being a science teacher, White knew that the name implied that his behaviour would be unpredictable, making him all the more dangerous to his enemies. Werner Heisenberg’s Uncertainty Principle, one of the foundational ideas of quantum mechanics, tells us that we can’t know with absolute precision both the position and momentum of subatomic particles. They exist in a range of probabilities, and when measured, they will appear at some truly random location within that range.
Albert Einstein, when he first read Heisenberg’s paper, famously objected to the premise with his remark that “God does not play dice with the universe!” He was deeply uncomfortable with the inherently stochastic nature of quantum mechanics and believed that the cosmos must be operating under some deterministic set of laws, even though we haven’t discovered how those laws work. Niels Bohr, naturally, responded to Einstein: “Stop telling God what to do!”
Einstein did eventually soften his stance, although he never lost his sense of discomfort, and it turned out Bohr and Heisenberg were right. The only source of true randomness in the universe lies deep in the mysteries of quantum mechanics. This, of course, raises the question: can we then use quantum computers to harness quantum randomness and thereby improve the fairness of lotteries and the security of data encryption?
Certifying randomness
A paper recently published in the journal Nature, with additional details available on the arXiv pre-print server, provides a promising answer. The authors, from JP Morgan Chase and quantum hardware manufacturer Quantinuum, as well as several universities, describe a method for certifying the true randomness of bit strings generated by a quantum computer and demonstrate its near-term applicability to solving the vulnerabilities inherent in classical random number generation.
The paper discusses two key techniques in quantum computing, Random Circuit Sampling (RCS) and Cross-Entropy Benchmarking (XEB), and explains how the two can be used together to produce certifiable random numbers. I’ve touched on RCS in a previous post, commenting on Google’s use of it to claim quantum supremacy. RCS by itself doesn’t do very much—it simply makes the quantum computer execute a large number of circuits, allowing engineers to test the speed and fidelity, or accuracy, of the quantum processors. I don’t buy the claim for quantum supremacy on that basis alone, but if RCS is combined with some intricate mathematics and XEB, the resulting random number generator looks like it will point us toward near-term quantum advantage.
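For a sense of what random circuit sampling looks like in code, here is a minimal sketch that assumes Qiskit and its Aer simulator are installed; the simulator stands in purely for illustration, whereas the real experiments run such circuits on quantum hardware and at much larger scale.

```python
from qiskit import transpile
from qiskit.circuit.random import random_circuit
from qiskit_aer import AerSimulator

# Build a handful of small random circuits and sample bit strings from each.
# On real hardware, the speed and fidelity of this sampling is what gets
# benchmarked.
sim = AerSimulator()
for i in range(3):
    qc = random_circuit(5, depth=8, measure=True, seed=i)
    counts = sim.run(transpile(qc, sim), shots=64).result().get_counts()
    print(f"circuit {i}: sample bit strings {list(counts)[:4]} ...")
```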
So, what is XEB? Entropy, as defined in thermodynamics, is a measurement of the amount of disorder in a physical system. Ice, for example, as a solid, has low entropy—it retains its shape and doesn’t change much. But when heat is applied and ice melts, the resulting water has higher entropy—it is able to flow fairly freely. Apply more heat to evaporate the water, and the resulting steam has higher entropy again—the molecules now float pretty randomly in the air.
For their part, mathematicians and computer scientists have adapted the term entropy as a measurement of how well-organized a data set is. Low entropy implies easily detectable patterns in the data and high entropy implies a random distribution of data. XEB compares the bit strings returned from a quantum computer against the ideal output distribution that could theoretically be realized in the absence of quantum noise and errors; the closer the match, the higher the score. Bit strings with an XEB score higher than a given threshold can be considered certifiably random.
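The exact statistic and certification argument in the paper are more involved, but the widely used linear cross-entropy benchmark gives the flavour: for an n-qubit circuit it takes 2^n times the average ideal probability of the observed bit strings, minus one, which comes out near 1 when the samples follow the circuit's ideal output distribution and near 0 when they are just uniform guesses. The sketch below uses a made-up Porter-Thomas-like distribution as a stand-in for a real circuit, so it is illustrative only.

```python
import numpy as np

def linear_xeb(ideal_probs: np.ndarray, samples: np.ndarray) -> float:
    """Linear cross-entropy benchmark.

    ideal_probs: length-2**n array of the circuit's ideal output probabilities
                 (obtained from classical simulation).
    samples:     observed bit strings encoded as integers in [0, 2**n).

    Returns roughly 1.0 for samples drawn from the ideal distribution and
    roughly 0.0 for uniformly random samples.
    """
    return len(ideal_probs) * ideal_probs[samples].mean() - 1.0

rng = np.random.default_rng(0)
n_qubits = 10

# Stand-in for a random circuit's output distribution: roughly exponential
# (Porter-Thomas-like) probabilities, normalized to sum to one.
p = rng.exponential(size=2**n_qubits)
p /= p.sum()

good = rng.choice(2**n_qubits, size=5000, p=p)      # samples from the ideal distribution
bad = rng.integers(0, 2**n_qubits, size=5000)       # uniform guessing

print("XEB, ideal-like samples:", round(linear_xeb(p, good), 3))   # close to 1
print("XEB, uniform samples:   ", round(linear_xeb(p, bad), 3))    # close to 0
```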
The JP Morgan Chase and Quantinuum researchers used a hybrid approach. A classical computer creates a number of quantum circuits, or sets of instructions, and sends them over to the quantum computer for execution. RCS is used to generate the resulting quantum bit strings, which are then tested using XEB. The speed of sampling was also important. Theoretically, a classical computer could be programmed to simulate a quantum computer and execute RCS, but this would necessarily be slower than a real quantum computer. Therefore, the researchers set a maximum allowable response time. The bit strings needed to be returned sufficiently fast that they could only have been generated by a quantum computer and not via classical simulation. Bit strings with a high XEB score and fast response time are certified as random.
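Putting the pieces together, the certification loop looks schematically like the sketch below. The names quantum_sample and ideal_probabilities are hypothetical placeholders for the quantum service and a trusted classical simulation, and the thresholds are invented for illustration; the protocol and parameters in the paper are considerably more sophisticated.

```python
import time

XEB_THRESHOLD = 0.3   # minimum acceptable cross-entropy score (illustrative)
MAX_SECONDS = 2.0     # fast enough to rule out classical simulation (illustrative)

def certify(circuits, quantum_sample, ideal_probabilities, xeb_score):
    """Schematic certification loop: accept only fast, high-XEB samples."""
    certified = []
    for circuit in circuits:
        start = time.monotonic()
        samples = quantum_sample(circuit)        # bit strings from the quantum computer
        elapsed = time.monotonic() - start

        score = xeb_score(ideal_probabilities(circuit), samples)
        if elapsed <= MAX_SECONDS and score >= XEB_THRESHOLD:
            certified.extend(samples)            # accepted as certified randomness
        # otherwise the response was too slow or too noisy, so it is rejected
    return certified
```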
A beacon of light
Of course, not everyone has access to a quantum computer of their own, or the skills to program it to generate high-entropy random bit strings. Much better, naturally, for this to be provided as a service. The JP Morgan Chase and Quantinuum authors recommend establishing a quantum randomness beacon that would continuously publish new certifiably random numbers for use by cryptographers, software developers and anyone else who needs access to randomness.
The US National Institute of Standards and Technology (NIST), through its Computer Security Resource Center (CSRC), already operates a random number beacon based on classical, pseudo-random number generators, but warns that it should not be used to generate cryptographic keys. It makes a new random number available every minute. A second version of the beacon, currently being tested, is based on the quantum randomness found in entangled pairs of photons. NIST could adopt the methods proposed by JP Morgan Chase and Quantinuum, or another service provider could offer a similar quantum randomness beacon.
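For the curious, fetching the latest pulse from the existing NIST beacon takes only a few lines. The endpoint and JSON field names below follow NIST's Beacon 2.0 REST interface as I understand it, so check them against the CSRC documentation before relying on them; the requests package is a third-party dependency. Remember, too, that beacon values are public by design, which is exactly why NIST warns against using them directly as secret keys.

```python
import requests

# Fetch the most recent pulse from the NIST randomness beacon (version 2.0).
# Endpoint and field names are my reading of the Beacon 2.0 REST interface;
# verify against the CSRC documentation before depending on them.
resp = requests.get("https://beacon.nist.gov/beacon/2.0/pulse/last", timeout=10)
resp.raise_for_status()
pulse = resp.json()["pulse"]

print("pulse index :", pulse["pulseIndex"])
print("timestamp   :", pulse["timeStamp"])
print("output value:", pulse["outputValue"][:64], "...")
```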
When quantum randomness is publicly available via beacons, lottery operators will be able to run fairer games and cryptographers will have access to more reliable key generators. The authors suggest other applications where certifiable randomness would be beneficial. Electronic voting, for example, depends on each person’s vote being recorded once and only once, without exposing any other personally identifiable information. E-voting systems therefore usually rely on public-key cryptography: each recorded vote is encrypted with a public key and decrypted by the election authorities using a private key. If those keys are vulnerable, election results could be compromised and voter records made public. The authors note that keys generated from quantum randomness would be less vulnerable.
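As a toy illustration of that public-key step (using the third-party cryptography package, and nothing like a real e-voting scheme), the sketch below encrypts a ballot with a public key so that only the holder of the private key can read it. The security of the exchange ultimately rests on the randomness consumed when the key pair is generated, which is exactly where a certified quantum source would help.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# The election authority generates the key pair; the private key stays offline.
# The quality of the randomness consumed here determines the keys' strength.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# A voting terminal encrypts the ballot with the published public key...
ballot = b"candidate=42"
ciphertext = public_key.encrypt(ballot, oaep)

# ...and only the election authority, holding the private key, can recover it.
assert private_key.decrypt(ciphertext, oaep) == ballot
```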
Financial markets also rely on randomness. IPOs, for example, are often oversubscribed, with more demand from investors than shares on offer. In some cases, the allocation is resolved via a lottery to determine which investors will be allowed to purchase shares. JP Morgan Chase and Quantinuum also note that some exchanges are experimenting with introducing short random delays in trade execution, to deter high-frequency traders from gaining an unfair ability to manipulate markets. In both cases, quantum-generated randomness would be an improvement.
Any process that relies on fairness in its inputs or outputs, or requires privacy and security, will be dependent on certifiable randomness. JP Morgan Chase and Quantinuum have shown that currently available quantum computers are better at producing such certifiable randomness than any known classical methods or algorithms.
Advantage: Quantum.