Originally published December 5, 2023
Frank Robinson might have known a thing or two about physics.
On the warm Sunday afternoon of May 8, 1966, the Baltimore Orioles were hosting Cleveland in an early-season double-header. In the bottom of the first inning of the second game, with one out and one on, Robinson was in the batter’s box facing Cleveland’s ace pitcher, Luis Tiant, who threw a fastball. Within a fraction of a second, Robinson intuitively applied Newton’s first law of motion: left to itself, the ball would fly straight, but the forces generated by its spin and the air rushing over its seams would drop it into the strike zone low and inside. Adjusting his swing accordingly, he then applied Newton’s second and third laws to launch the ball on a trajectory that carried it completely out of Baltimore’s Memorial Stadium, rolling to a stop underneath a car in the parking lot 541 feet from home plate. It’s not often a player literally hits one out of the park.
The Orioles would win the double-header and, eventually, the World Series that year. Robinson went on to a Hall of Fame career with a .294 batting average, 1,812 RBIs and 586 home runs. He remains to this day the only player ever to be named MVP in both the American League and the National League[1]. And on July 31, 1973, he gave us his famous quote, from which I’ve borrowed the title of this post: “Close don’t count in baseball. Close only counts in horseshoes and hand grenades.”
There’s something to that—a pitch is either a ball or a strike, a hit is either fair or foul, a base runner is either safe or out. There’s no ambiguity to it. Isaac Newton would agree. Whether we’re launching baseballs out of the park or rocket ships to the moon, Newton’s laws have enabled us to precisely predict the behaviour of objects in motion on a large scale for hundreds of years.
But Werner Heisenberg might disagree—and he might tell us the old joke about three umpires describing their approach to the game. The first says, “I call ’em as I see ’em,” to which the second replies, “Well, I call ’em as they are.” The third points out the fundamental truth of quantum physics: “Boys, they ain’t nothing until I call ’em!” It turns out that the precision Frank Robinson treasured in baseball is tempered by the subjectivity of the umpire’s powers of observation.
Similarly, in the case of quantum physics, Heisenberg’s Uncertainty Principle undermines the accuracy we’ve come to expect from Newton. And, while we count on classical computers to give us exact solutions to the problems we pose them, we’re finding that—thanks, again, to Heisenberg—quantum computers are only good at being almost right. In mathematical terms, quantum computing is probabilistic instead of deterministic, but this doesn’t mean it isn’t useful. To see how, let’s take a closer look at the uncertainty principle.
Say his name
Heisenberg was an early 20th-century pioneer of quantum physics. In the summer of 1925, he retreated to the small North Sea island of Helgoland, off the German coast, and immersed himself in defining the mathematics that would explain the behaviour of electrons inside atoms—behaviour that laboratory observations of atomic spectra had revealed and that his mentor, Niels Bohr, had modelled. His breakthrough was to use only what could be observed—the frequency and intensity of light emitted when an electron moves between orbits—to build tables of numbers describing the transitions between the electron’s possible states. The mathematics worked and its elegance impressed even Albert Einstein, but it came at the cost of no longer being able to describe precise quantities like the position or momentum of any subatomic particle. This is where the Uncertainty Principle and quantum superposition go hand in hand: the more we know about one quantity, the less we know about the other—and until we observe the particles, we can consider them to be simultaneously in all their possible states. They “ain’t nothing”—or they’re always everything—until we “call ’em.” Heisenberg was initially alarmed by the ramifications but eventually came to appreciate what he was seeing as the “strangely beautiful interior” of reality.
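For the record, the principle has a precise mathematical form (this is the standard textbook statement, included here for context, not anything specific to this post): the product of the uncertainties in a particle’s position and momentum can never shrink below a fixed constant of nature,

$$
\Delta x \,\cdot\, \Delta p \;\ge\; \frac{\hbar}{2}
$$

where Δx is the uncertainty in position, Δp is the uncertainty in momentum, and ħ is the reduced Planck constant. Pin one quantity down and the other must blur.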
Think of classical computers as a Newtonian, deterministic approach to computation. Built on bits—binary digits that can hold a value of zero or one and nothing else—they give us the same precise solutions to the same calculations, every time. But they are approaching the limits of their scalability and just won’t be able to handle the enormous, computationally intensive problems we’re facing today. In quantum computers, by contrast, we build qubits out of subatomic particles and exploit superposition to represent far more states at once and calculate much faster than classical computers ever could. As a result, quantum computers can take over where classical computers give up—and solve these extremely complicated mathematical problems. However, to get the benefit of superposition we must also accept uncertainty, and therefore we should not expect 100 per cent accuracy from quantum computers. This is perfectly fine. The tradeoff is that we can tackle the hardest of problems—we just need to accept that our results are approximate, albeit very good.
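To make that probabilistic flavour concrete, here is a minimal sketch, simulating a single idealized qubit in plain Python with NumPy on an ordinary classical machine (no real quantum hardware involved). A qubit in an equal superposition yields a definite 0 or 1 only when measured, and repeated runs give a distribution rather than one fixed answer.

```python
import numpy as np

# A qubit state is a 2-component complex vector: amplitudes for |0> and |1>.
# Equal superposition: (|0> + |1>) / sqrt(2).
state = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(state) ** 2   # -> [0.5, 0.5]

# "Measuring" collapses the superposition to 0 or 1 at random.
rng = np.random.default_rng()
samples = rng.choice([0, 1], size=1000, p=probs)

# A deterministic program would print the same counts every run;
# here we get roughly 500 of each, slightly different every time.
print("zeros:", np.sum(samples == 0), "ones:", np.sum(samples == 1))
```

Run it twice and the counts differ. That wobble is not a bug but the physics, and it is the uncertainty we agree to live with.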
Seventh inning stretch
In many industrial chemistry and physics applications, lab experimentation is the costliest and most time-consuming approach to solving a problem. Think of building wind tunnels for auto-body or airplane-fuselage design, or of testing the complex chemical interactions behind new battery technologies. Scientists often turn to computer simulations instead, but classical computers just don’t have the capacity or speed to store all the variables or execute all the necessary calculations. Quantum computers are beginning to be applied to these problems and can deliver simulation results that correlate well, though not perfectly, with results from the lab. Close appears to be good enough, even if these applications are not yet commercially viable, and I will cover them in more detail in future posts. Let’s look now at some more real-world situations.
Take the problem of prime factorization. While a classical computer can correctly factor small integers with ease, the problem gets exponentially harder as the integer gets larger. It would take a classical computer thousands of years to find the prime factors of a 2,048-bit integer, but Shor’s Algorithm shows that a sufficiently powerful quantum computer could accomplish this in a matter of minutes. Now, it does turn out that you might have to run the algorithm a few thousand times to get—statistically—the right answer, but the quantum speed-up makes the iterations worthwhile. Because any candidate answer can be verified instantly on a classical machine (just multiply the factors back together), the occasional wrong run costs almost nothing. The downside in this case, of course, is the eventual destruction of cryptography as we know it—but new quantum-resistant ways of generating and exchanging cryptographic keys are already being developed, so I am optimistic that we will be OK on that front.
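To see why an unreliable routine is still useful, here is a minimal classical sketch in plain Python. The noisy_factor_candidate function below is a hypothetical stand-in for one run of a probabilistic quantum subroutine, not an implementation of Shor’s Algorithm; the point is the repeat-and-verify loop around it.

```python
import random

def noisy_factor_candidate(n):
    """Hypothetical stand-in for one run of a probabilistic (e.g., quantum)
    factoring routine: usually wrong, occasionally a true factor of n."""
    if random.random() < 0.05:                 # the rare "good" run
        return next(p for p in range(2, n) if n % p == 0)
    return random.randrange(2, n)              # otherwise, a junk guess

def factor_with_retries(n, max_tries=10_000):
    """Repeat the noisy routine; verifying a candidate costs one modulo
    operation, so the wrong answers are almost free."""
    for attempt in range(1, max_tries + 1):
        candidate = noisy_factor_candidate(n)
        if 1 < candidate < n and n % candidate == 0:
            return candidate, attempt
    raise RuntimeError("no factor found")

factor, tries = factor_with_retries(15)        # 15 = 3 x 5
print(f"found factor {factor} after {tries} tries")
```

Each individual run is untrustworthy, but the loop as a whole converges on a certified answer, which is exactly the bargain Shor’s Algorithm offers at scale.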
Or consider the optimization problems I’ve discussed before, like the traveling salesperson. For small instances of these problems, the search space is small enough that classical computing can check every option and deliver an exact result. But make the problem just a bit bigger—hundreds or thousands of destinations in the case of the traveling salesperson—and we’re back to years of work for even the most powerful classical computers. Quantum computers can attack the problem differently and deliver an approximate, good-enough result many times faster.
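As a purely classical illustration of that trade-off (this sketch is ordinary Python, not a quantum algorithm; quantum optimizers work very differently under the hood), compare exhaustive search, which is exact but grows factorially, with a greedy heuristic that is fast and merely close:

```python
import itertools, math, random

random.seed(1)
cities = [(random.random(), random.random()) for _ in range(9)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def tour_length(order):
    return sum(dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
               for i in range(len(order)))

# Exact: try every ordering -- (n-1)! tours, hopeless beyond ~15 cities.
best = min(itertools.permutations(range(1, len(cities))),
           key=lambda rest: tour_length((0,) + rest))
print("exact:      ", round(tour_length((0,) + best), 3))

# Approximate: greedy nearest-neighbour runs in about n^2 steps and
# usually lands close to, but not exactly on, the optimum.
unvisited, tour = set(range(1, len(cities))), [0]
while unvisited:
    nxt = min(unvisited, key=lambda c: dist(cities[tour[-1]], cities[c]))
    tour.append(nxt)
    unvisited.remove(nxt)
print("approximate:", round(tour_length(tuple(tour)), 3))
```

Nine cities is the last comfortable stop for the exact loop; at a thousand cities, only the approximate approach, classical or quantum, is even possible.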
Machine Learning, as we saw in my last post, bears a lot of similarities to the mathematics of optimization. Give an ML algorithm a simple task like, say, winning (or tying) a game of tic-tac-toe, and it can give you an exact solution. But give a naive algorithm a more complicated game like chess, and as likely as not it’s still going to lose to a strong human player. To be fair, even though chess is a game of perfect information, nobody has yet figured out the absolute best winning strategy, because the number of possible move combinations—finite, but astronomically large—is far beyond anything we could ever enumerate. We don’t expect perfection from ML in other tasks like handwriting recognition, either—we just hope it can do the job better or faster than we can. Even with the improvements in computational speed and efficiency that quantum represents, we still look only for better results than before, not perfect results.
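To see why the small game is tractable, here is a minimal brute-force sketch in plain Python, a search-based stand-in rather than a learned model. It scores every reachable tic-tac-toe position exactly, something no computer could ever do for chess.

```python
def winner(board):
    lines = [(0,1,2), (3,4,5), (6,7,8), (0,3,6),
             (1,4,7), (2,5,8), (0,4,8), (2,4,6)]
    for a, b, c in lines:
        if board[a] != " " and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player):
    """Exhaustively score a position: +1 if X can force a win, -1 if O
    can, 0 for a forced draw. Tic-tac-toe has under 9! = 362,880 move
    sequences, so this finishes in moments; chess never would."""
    w = winner(board)
    if w:
        return 1 if w == "X" else -1
    if " " not in board:
        return 0
    scores = []
    for i, cell in enumerate(board):
        if cell == " ":
            board[i] = player
            scores.append(minimax(board, "O" if player == "X" else "X"))
            board[i] = " "
    return max(scores) if player == "X" else min(scores)

# An empty board is a forced draw under perfect play: prints 0.
print(minimax(list(" " * 9), "X"))
```

The same exhaustive idea applied to chess would need to visit more positions than there are atoms in the observable universe, which is why we settle for better, not perfect.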
Batter up
There’s an old saying in the IT industry—it’s better to be approximately right than exactly wrong. Taking that a step further, quantum computers can be approximately right where classical computers can barely even start, and I think that’s close enough. Like Heisenberg, I find there’s a certain beauty in it—it’s all about the imperfection and the challenge to do better. In the words of the American novelist Christopher Morley: “For after all, happiness (as the mathematicians might say) lies on a curve, and we approach it only by asymptote.”[2]
So, let’s update Frank Robinson’s words of wisdom: “Close only counts in horseshoes, hand grenades—and quantum computing.”
[1] As of the date of publication. In November 2024, Shohei Ohtani of the Los Angeles Dodgers joined Robinson in this achievement.
[2] Morley, Christopher. The Haunted Bookshop. J. B. Lippincott Company, 1955.