It’s still early in 2025, and every year is designated, often multiple times, as the year of something. So, glass of wine in hand, I sat at my keyboard and asked ChatGPT to finish the line “The year of . . .” It came up with the rather insipid “new beginnings.” I asked it “what else have you got?” and received a list of 10 mostly equally bland replacements. Refining my prompt to “2025 is the year of” only made matters worse, yielding such banalities as “bold moves,” “limitless potential,” “new horizons” and “unforgettable moments.” As I refilled my glass, I thought ChatGPT could have a career writing captions for those trite motivational posters you often see hanging on office walls—or maybe the posters were just part of its training data.
But upon further reflection, some of the words ChatGPT served up—including “resilience,” “transformation,” “reinvention” and “breaking barriers”—might at least be significant, if not hugely creative. In Toronto, as in many places around the world, January 29th marked the celebration of Chinese New Year, and 2025 is the year of the snake. An article in the UK paper The Guardian describes its characteristics. Despite the animal’s reputation as a villain and bringer of bad luck, people born in the year of the snake are said to be resilient, creative, adaptive and able to overcome all kinds of obstacles and challenges. The snake as a symbol represents transformation, fertility and renewal.
Perhaps it’s not a coincidence that 2025 has also been designated by UNESCO as the International Year of Quantum Science and Technology. I’ve always viewed quantum computing as the plucky underdog to the overhyped AI juggernaut, and it shares many of the characteristics ChatGPT served up—the same ones attributed to the year of the snake. But before we circle back to that point, it’s worth noting that UNESCO named quantum science and technology, not just quantum computing. There’s a lot more to quantum than qubits. Let’s have a look at the roots of quantum mechanics and revisit quantum computing; in my next article I’ll do a deeper dive into the rest of quantum technology.
The quantum century
2025 is an appropriate choice, as it marks the 100th anniversary of the publication of Werner Heisenberg’s paper, dauntingly titled “On the quantum-theoretical reinterpretation of kinematical and mechanical relationships,” known for short as the “reinterpretation paper.” In it, he laid the groundwork for the matrix notation that would describe the behaviours of subatomic particles. The non-commutative[1] nature of matrix mathematics led to his formulation of the famous uncertainty principle, generally described as the notion that you can’t know two related values about the same particle, like its position and momentum, at once.[2]
Maybe going a bit crazy due to the effects of hay fever and his self-imposed isolation on the island of Helgoland, Heisenberg felt deeply uncomfortable with his discoveries. He corresponded at length with colleagues, including Max Born, Paul Dirac and Wolfgang Pauli, all of whom assured him that he was correct and helped him flesh out the mathematical underpinnings of quantum mechanics.
In the same year, 1925, Erwin Schrödinger approached quantum mechanics from a different perspective. Being familiar with Louis de Broglie’s postulate that all matter vibrates with a particular wavelength[3], Schrödinger devised his famous wave equation to describe the possible behaviours of particles, introducing the now-famous idea of quantum superposition. Just as the vibration of a piano string looks blurry to the naked eye, Schrödinger suggested that the wave description of a subatomic particle implied that you can consider the particle to be in all its possible positions at once. That is, until you observe or measure the particle, at which point the wave function collapses to a single value. You’ve pressed a damper down on the piano string, and the music stops.
De Broglie won the Nobel prize for Physics in 1929, Heisenberg won it in 1932, Schrödinger and Dirac shared it in 1933, and Pauli won it in 1945. Albert Einstein and Niels Bohr had been awarded their Nobels in 1921 and 1922, and their work had laid the foundations of quantum mechanics.
Einstein, in a 1927 series of conversations (or debates) with Bohr, first proposed the phenomenon that would eventually become known as quantum entanglement. The idea was later elaborated by Schrödinger and Grete Hermann, and revisited by Einstein and other colleagues in the mid-1930s. Entanglement means that once two or more quantum particles, or systems of quantum particles, have interacted, they can never again be described independently. However far apart they are separated, their measurement outcomes remain correlated: observing one tells you what an observation of the other will show. Think of entanglement as similar to the stories sometimes told of identical twins, separated at birth and raised independently, yet who end up living remarkably similar lives—having the same mannerisms and tastes in food, marrying spouses with the same name and occupation, etc. The twins haven’t communicated with each other, yet they share all these common characteristics.
Einstein and Schrödinger were both dissatisfied with entanglement, in part because it seemed to violate the theory of relativity, which implied that nothing could exceed the universally constant speed of light. But the whole idea of entanglement is that no information is transmitted between the particles—instead, they share a common quantum state and cannot be considered independently of each other. Experiments have validated that entanglement is real, and it has direct applicability in computing and data security.
Weird as it seems, quantum mechanics has held up for the past century as the pre-eminent explanation of how the world works. It’s been improved and enhanced over time but never disproven. Among the many consequences, three in particular stand out: its implications for computing, data communication and sensor technology.
Counting on quantum
In the early 1980s, Richard Feynman was one of the first physicists to consider the possibility of using quantum superpositioning to define a new way of designing computers. I’ve written a bit about this before, and also discussed the inherently probabilistic nature of quantum computing. It’s worth revisiting this at a slightly deeper level, to wit: exactly how is it that qubits have the potential to deliver so much more computing power?
Remember that binary bits, the foundation of classical computing, have a value of zero or one. Therefore, with one bit, you can store two possible values. With two bits, you can store four. With three bits, eight possible values and with four bits, 16. Your storage capacity is growing exponentially as a power of two, so that with n bits, you can store 2^n possible values.
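The doubling is easy to check for yourself. This little sketch simply counts the distinct values that n classical bits can represent:

```python
# Each additional bit doubles the number of distinct values you can store.
def distinct_values(n_bits: int) -> int:
    return 2 ** n_bits

for n in (1, 2, 3, 4, 8):
    print(n, "bits ->", distinct_values(n), "possible values")
```

With eight bits, for instance, that comes to 256 possible values, matching the familiar 0-to-255 range of a byte.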
Qubits, at the end of the day, also have a value of zero or one. But part of the magic comes from quantum superpositioning. Remember the analogy of the piano string—it vibrates between two known positions, and this is called the amplitude of its vibration. If you watch the vibrating string, all you can really say is that it has a probability of being observed in some location within its amplitude at any given moment. Similarly, qubits are in a superposition of zero and one, meaning that they carry a probability distribution of being either zero or one. For example, a qubit could have a 25-per-cent probability of being zero and a 75-per-cent probability of being one. Under the hood, the superposition is represented as a vector of complex numbers called amplitudes, and the probabilities are the squared magnitudes of those amplitudes—believe it or not, it makes the mathematics easier—but this allows us to simplify and say that a qubit is zero and one at the same time, or anywhere in between. What it means is that with an arbitrary quantity—say, n qubits—you don’t hold just one of 2^n possible values as you would with classical bits; instead, you’re holding all the possible values from 0 to (2^n – 1) at once. For example: with eight classical binary bits, you could represent any one value from 0 to 255. With eight qubits, you would represent all those 256 possible values at once.
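You can get a feel for the amplitude-and-probability bookkeeping with a toy simulation—no quantum hardware or SDK involved, just a pair of complex numbers standing in for a single qubit’s state:

```python
import random

# A toy single qubit: two complex amplitudes, one for "zero", one for "one".
# The probabilities are the squared magnitudes, and they must sum to 1.
alpha = complex(0.5, 0.0)          # amplitude for zero
beta = complex(0.0, 0.75 ** 0.5)   # amplitude for one

p_zero = abs(alpha) ** 2           # 25-per-cent chance of observing zero
p_one = abs(beta) ** 2             # 75-per-cent chance of observing one
assert abs(p_zero + p_one - 1.0) < 1e-9

def measure() -> int:
    """Observing the qubit collapses the superposition to a single value."""
    return 0 if random.random() < p_zero else 1
```

Before you call `measure()`, the state genuinely carries both possibilities; afterwards, all you have is one classical bit—which is exactly the wave-function collapse described above.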
Getting it (mostly) right
Programming a quantum computer implies influencing the probability values of the qubits, and when the calculation is complete and the qubits are observed, a single value is produced. Schrödinger would say that his wave function has collapsed, or been solved, for a specific value. It’s not a perfect analogy—no analogy is perfect—but it’s kind of like a musician striking piano keys, causing multiple strings to vibrate: at some point they release the damper pedal, the dampers fall back onto the strings and stop all the vibrations, and the final chord fades out.
Now that we’ve seen how quantum superpositioning drives the power of quantum computing, what about entanglement? I’ve written before about how qubits are extremely sensitive to environmental noise, which can cause them to decohere and lose their value. Correcting for these noise-induced errors usually means constructing logical qubits from multiple physical qubits, and much work has been done in the past year to reduce the ratio of physical to logical qubits and thereby improve the efficiency of the quantum computer. Commonly, physical qubits are connected via entanglement to create a logical qubit.
The University of Waterloo has a good article describing how it works. I like their analogy of a book with 100 pages. Classically, if you read one page at a time, each page would give you one per cent of the information in the book. But if the pages were entangled qubits, you would have to read the entire book cover to cover before suddenly understanding all the information in it. Similarly, in a logical qubit, you don’t just measure one of the entangled qubits—you measure the entire entangled system of qubits to get your answer. And if one of the physical qubits decoheres, the impact on the data in the whole system is minimal.
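Real quantum error-correction codes entangle the physical qubits, which a few lines of classical code can’t reproduce. But the underlying idea—spread one logical value across several carriers so that a single failure doesn’t corrupt the answer—can be sketched with a simple repetition code and a majority vote:

```python
from collections import Counter

def encode(bit: int, copies: int = 5) -> list[int]:
    # Classical stand-in for a logical qubit: spread one value
    # across several physical carriers.
    return [bit] * copies

def decode(carriers: list[int]) -> int:
    # Majority vote: a single corrupted carrier doesn't change the result.
    return Counter(carriers).most_common(1)[0][0]

logical = encode(1)
logical[2] = 0              # one "physical" carrier suffers an error
assert decode(logical) == 1 # the logical value survives
```

The trade-off is the same one the article describes: you spend several physical carriers (qubits) per logical value, which is why reducing that ratio matters so much for efficiency.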
Superposition drives the power of quantum computing, and entanglement provides a path toward fault tolerance in quantum computing. But these powerful phenomena also carry Heisenberg’s uncertainty, making quantum computing an inexact science. The results from executing quantum algorithms are probabilistic, and they generally conform to a typical statistical bell curve. So, quantum programmers usually talk about expectation values—a range of answers that provide a good-enough approximation to solve the problem at hand. Different types of algorithms have different expectation values. Shor’s algorithm, for example, which efficiently finds the prime factors of large integers, rapidly converges to nearly 100-per-cent accuracy the more factors it has to find. Other algorithms like the Variational Quantum Eigensolver (VQE) are effective at yielding better answers to problems in molecular modelling, with applicability in chemistry, biophysics and life sciences.
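To make “expectation value” concrete, imagine running a probabilistic computation many times and averaging the outcomes. The routine below is purely hypothetical—a stand-in for a quantum algorithm whose answers scatter in a bell curve around the true value:

```python
import random

random.seed(42)  # make the sketch repeatable

def noisy_run() -> float:
    # Hypothetical probabilistic routine: each run returns an answer
    # near the true value of 10, spread out like a bell curve.
    return random.gauss(10.0, 0.5)

samples = [noisy_run() for _ in range(10_000)]
expectation = sum(samples) / len(samples)
# Any single run may be off, but the average lands very close to 10.
```

No single run is trusted; the answer is the distribution’s centre, plus a tolerance that’s good enough for the problem at hand.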
Proceed with caution
In the final months of 2024, quantum stocks began to surge. I’m not sure why—I’d like to think that investors started to tire of throwing money at anything with AI in its name—but the surge proved to be short-lived. In January, Jensen Huang, CEO of AI chip maker Nvidia, claimed that practical quantum would not be available for another 15 to 30 years, causing major pullbacks in the same quantum equities.[4] In my opinion, neither the positive nor the negative hype was justified. As I’ve written before, quantum computing professionals have a responsibility to manage expectations and reduce hype, lest we fall into the same trap as generative AI—overstatement of the benefits while ignoring the risks.
For example, we’re not well served by the term Quantum Supremacy, nor Google’s claim of having achieved it in late 2024 with their new Willow quantum computer. Willow, we were told, was able to solve a problem in minutes that would take billions of years for the best classical computers. But if we look closer, we find that Google executed something called Random Circuit Sampling, which is a technique for testing the speed of a processor but not necessarily solving a useful problem. Think of it as pressing the accelerator pedal on your Porsche while the transmission is in neutral—your engine revs very fast and produces a lot of noise, but you’re not going anywhere. To Google’s credit, they were able to prove some good results in scaling up error correction in this experiment, and that will prove to be the best outcome—but until it can be applied practically, supremacy or even quantum advantage will have to wait.
Personally, I’m getting a bit tired of the grandiose claims like “quantum computing will solve climate change and world hunger.” I don’t doubt that quantum will eventually be part of the solutions to these and other big problems, but we have to work our way up and there are many other pieces to those puzzles. Quantum supremacy may never exist because we need to recognize quantum computing for what it is—a fascinating and powerful technology that is applicable to a small number of interesting use cases. Don’t get me wrong—where it is applicable, it will be breathtaking—but like any good tool, let’s use it only where and when we need it. IBM has coined the term quantum-centric supercomputing, which combines quantum, AI and high-performance computing, and I think that’s heading in the right direction. This is where we will achieve quantum advantage: put the right workload on the right machines, add a layer of coordination and orchestration, and then we will be able to solve the hardest problems. No one technology will do it alone. When will we get there? I think Jensen Huang was overly pessimistic. IBM has published a timeline to useful quantum computing by the end of the current decade. That might be a bit aggressive, but it’s achievable if the company puts all its resources into it.
On beyond computing
Meanwhile, as UNESCO indicated, computing is not all there is to quantum. Quantum computing is often referred to as the second wave of quantum technology. The first wave consists of tools we’re already pretty familiar with: lasers, electron microscopes, solar cells and touch screens are all based on quantum mechanics and proved their utility before we developed the capability to manipulate qubits. The third wave of quantum is primarily defined as quantum communication, but advances in quantum sensors also count. Much of the third wave is still experimental but interesting progress is being made in developing a new quantum internet and highly accurate sensors for autonomous vehicles, for example. I’ll discuss these first and third waves in more detail in my next post.
In 2025, quantum technology exhibits many of the best characteristics of the year of the snake. It’s resilient, taking years to overcome many obstacles like error correction and maintaining coherence. It’s had to adapt itself to developing the right algorithms for the right kinds of problems. And it will be used creatively in ways we can’t imagine yet. Watch a snake, and its movements resemble nothing so much as a wave function, which de Broglie and Schrödinger would appreciate. Quantum technology is progressing the same way—oscillating back and forth but relentlessly moving forward.
[1] Multiplying integers is commutative, which means that the order doesn’t matter: 3 x 4 = 4 x 3 = 12, for example. But when you multiply matrices, which are two-dimensional tables of values, the order does matter. If A and B are matrices, then A x B generally gives a different result from B x A, so matrix multiplication is non-commutative.
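To see the footnote’s claim concretely, here are two small matrices multiplied in both orders—a quick sketch using plain nested lists, no libraries:

```python
def matmul(a, b):
    # Multiply two 2x2 matrices represented as nested lists.
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]

assert matmul(A, B) != matmul(B, A)   # order matters: non-commutative
```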
[2] More accurately, the uncertainty principle states that there is a limit to the precision with which you can know both values. The more you know about one, the less you know about the other.
[3] De Broglie’s waves are significant at the subatomic level, but imperceptibly small for anything bigger.
[4] Only a few weeks later, Huang got his comeuppance when Nvidia stock dropped 17 per cent in one day after the surprise release of DeepSeek, China’s new and vastly less expensive generative AI system. But that’s another story for another day.