The promise of quantum computers is that certain computational tasks might be executed exponentially faster on a quantum processor than on a classical processor.
Quantum computing is a beautiful fusion of quantum physics and computer science. It incorporates some of the most stunning ideas of twentieth-century physics into an entirely new way of thinking about computation. Quantum computers have the potential to resolve problems of high complexity and magnitude across many different industries and applications, including finance, transportation, chemicals, and cybersecurity – solving in a few hours of computing time problems that would otherwise be impossible.
Quantum computing is often in the news: China teleported a qubit from Earth to a satellite; Shor’s algorithm has put our current encryption methods at risk; quantum key distribution will make encryption safe again; Grover’s algorithm will speed up data searches. But what does all this really mean? How does it all work?
Today’s computers operate in a straightforward fashion: they manipulate a limited set of data with an algorithm and give you an answer. Quantum computers are more complicated. After multiple units of data are input into qubits, the qubits are manipulated to interact with other qubits, allowing several calculations to be done simultaneously. This is what makes quantum computers so much faster than today’s machines.
Quantum computers have four fundamental capabilities that differentiate them from today’s classical computers:
- Quantum simulation, in which quantum computers model complex molecules;
- Optimization, that is, solving multivariable problems with unprecedented speed;
- Quantum artificial intelligence (AI), with better algorithms that could transform machine learning across industries; and
- Prime factorization, which could revolutionize encryption.
From Bit to Qubit
All computations involve inputting data, manipulating it according to certain rules, and then outputting the final answer. For classical computations, the bit is the basic unit of data. For quantum computation, this unit is the quantum bit – usually shortened to qubit.
A classical bit is either 0 or 1. If it’s 0 and we measure it, we get 0; if it’s 1 and we measure it, we get 1. In both cases the bit remains unchanged. The standard example is an electrical switch that can be either on or off. The situation is totally different for qubits. Qubits are volatile. A qubit can be in one of an infinite number of states – a superposition of both 0 and 1 – but when we measure it, as in the classical case, we get just one of two values, either 0 or 1. Unlike the classical case, though, the act of measurement changes the qubit. Qubits can also become entangled: when we measure one qubit of an entangled pair, the measurement affects the state of the other. What’s more, qubits interact with other qubits, and these interactions are what make it possible to conduct multiple calculations at once.
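The probabilistic character of measurement can be sketched with a toy model (plain Python, not a real quantum library; the state representation and the `measure` helper are illustrative assumptions): a qubit is a pair of amplitudes, and measuring it returns 0 or 1 at random with probabilities given by the squared amplitudes, collapsing the state to the result.

```python
import random

# Toy model of a single qubit (an illustration, not a real quantum simulator).
# A qubit's state is a pair of amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Measuring returns 0 with probability |alpha|^2
# and 1 with probability |beta|^2 -- and collapses the state to the result.

def measure(state):
    alpha, beta = state
    if random.random() < abs(alpha) ** 2:
        return 0, (1.0, 0.0)  # collapsed to the state |0>
    return 1, (0.0, 1.0)      # collapsed to the state |1>

# An equal superposition of 0 and 1:
plus = (2 ** -0.5, 2 ** -0.5)

# Measuring many freshly prepared copies gives 0 about half the time,
# even though each individual measurement returns a definite 0 or 1.
outcomes = [measure(plus)[0] for _ in range(10_000)]
print(sum(outcomes) / len(outcomes))  # close to 0.5
```

Note that each run of `measure` yields a definite answer; the superposition only shows up in the statistics over many runs.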
Nobody really knows quite how or why entanglement works. It even baffled Einstein, who famously described it as “spooky action at a distance.” But it’s key to the power of quantum computers. In a conventional computer, doubling the number of bits doubles its processing power. But thanks to entanglement, adding extra qubits to a quantum machine produces an exponential increase in its number-crunching ability.
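The correlation itself is easy to picture with a toy sketch (again plain Python; `measure_bell_pair` is an illustrative name, and no physics is being simulated here): for the entangled Bell state (|00⟩ + |11⟩)/√2, each joint measurement yields 00 or 11 with equal probability, so the two qubits’ outcomes always agree, no matter how far apart they are.

```python
import random

# Toy sketch of measuring the entangled Bell state (|00> + |11>)/sqrt(2).
# Each joint measurement gives 00 or 11, each with probability 1/2, so the
# two qubits' individual results look random yet always match.

def measure_bell_pair():
    outcome = random.choice([0, 1])  # 0 or 1, each with probability 1/2
    return outcome, outcome          # both qubits collapse to the same value

pairs = [measure_bell_pair() for _ in range(1_000)]
print(all(a == b for a, b in pairs))  # True -- perfectly correlated outcomes
```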
These three things – superposition, measurement, and entanglement – are the key quantum mechanical ideas. Controlling these interactions, however, is very complicated. The volatility of qubits can cause inputs to be lost or altered, which can throw off the accuracy of results. And creating a computer of meaningful scale would require hundreds of thousands, or even millions, of qubits to be connected coherently. The few quantum computers that exist today can handle nowhere near that number, but the field is advancing quickly.
Quantum computing and classical computing are not two distinct disciplines. Quantum computing is the more fundamental form of computing – anything that can be computed classically can also be computed on a quantum computer. The qubit, not the bit, is the basic unit of computation; a qubit can be physically realized by the spin of an electron or the polarization of a photon. Computation, in its essence, really means quantum computing.
Quantum Supremacy and Parallel Universes
In 2019 Google reported achieving quantum supremacy, using a processor with programmable superconducting qubits to create quantum states on 54 qubits (53 of which functioned), corresponding to a computational state-space of dimension 2^53 (about 10^16). This incredible achievement was slightly short of their stated goal of creating quantum states on 72 qubits. What is so special about that number? Classical computers can simulate quantum computers if the quantum computer doesn’t have too many qubits, but as the number of qubits increases we reach a point where that is no longer possible.
There are 8 possible three-bit combinations: 000, 001, 010, 011, 100, 101, 110, 111. The number 8 comes from 2^3: there are two choices for the first bit, two for the second, and two for the third, and we multiply these three 2s together. If instead of bits we switch to qubits, each of these 8 three-bit strings is associated with a basis vector, so the vector space is 8-dimensional. If we have 72 qubits, the number of basis elements is 2^72. This is about 4,700,000,000,000,000,000,000 – a large number, considered to be roughly the point at which classical computers can no longer simulate quantum computers. Once quantum computers have more than 72 or so qubits we truly enter the age of quantum supremacy – when quantum computers can do computations that are beyond the ability of any classical computer.
To provide a little more perspective, let’s consider a machine with 300 qubits. This doesn’t seem an unreasonable number for the not-too-distant future. But 2^300 is an enormous number – more than the number of elementary particles in the known universe. A computation using 300 qubits would be working with 2^300 basis elements.
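The exponential growth is easy to verify directly; a throwaway Python snippet prints the state-space dimensions quoted above:

```python
# The state vector of an n-qubit register has 2^n basis elements,
# so its size doubles with every qubit added.
for n in (3, 53, 72, 300):
    print(f"{n:>3} qubits -> 2^{n} = {2 ** n:.4e} basis states")
```

Already at 72 qubits the count (about 4.7 × 10^21) outstrips any classical memory, and 2^300 (about 2 × 10^90) dwarfs the roughly 10^80 elementary particles in the known universe.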
Industry-Specific Applications
Some calculations required for the effective simulation of real-life scenarios are simply beyond the capability of classical computers – what’s known as intractable problems. Quantum computers, with their huge computational power, are ideally suited to solving these problems. Indeed, some problems, like factoring, are “hard” on a classical computer, but are “easy” on a quantum computer. This creates a world of opportunities, across almost every aspect of modern life.
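To see why factoring is “hard” classically, consider trial division, the simplest classical factoring method (a minimal sketch; `smallest_factor` is an illustrative helper). Its running time grows roughly with √N – that is, exponentially in the number of digits of N – whereas Shor’s quantum algorithm runs in time polynomial in the digit count.

```python
# Naive classical factoring by trial division. The loop runs up to sqrt(n)
# times, so the cost grows exponentially with the number of digits of n --
# which is what makes factoring large numbers classically intractable.

def smallest_factor(n: int) -> int:
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # no divisor found: n is prime

print(smallest_factor(3 * 5))      # 3
print(smallest_factor(101 * 103))  # 101
```

For the 2048-bit numbers used in RSA encryption, this loop would need on the order of 2^1024 iterations – hopeless on any classical machine.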
Healthcare: classical computers are limited in the size and complexity of the molecules they can simulate and compare (an essential process in early drug development). Quantum computers will allow much larger molecules to be simulated. At the same time, researchers will be able to model and simulate interactions between drugs and all 20,000+ proteins encoded in the human genome, leading to greater advancements in pharmacology.
Finance: one potential application is algorithmic trading – using complex algorithms to automatically trigger share dealings based on a wide variety of market variables. The advantages, especially for high-volume transactions, are significant. Another application is fraud detection. Like diagnostics in healthcare, fraud detection relies on pattern recognition. Quantum computers could deliver a significant improvement in machine learning capabilities, dramatically reducing the time taken to train a neural network and improving the detection rate.
Logistics: improved data analysis and modelling will enable a wide range of industries to optimize workflows associated with transport, logistics and supply-chain management. The calculation and recalculation of optimal routes could impact applications as diverse as traffic management, fleet operations, air traffic control, and freight and distribution.
The Road Ahead
It is, of course, impossible to predict the long-term impact of quantum computing with any accuracy. Quantum computing is now in its infancy, and the comparison to the first classical computers seems apt. The machines constructed so far tend to be large and not very powerful, and they often involve superconductors that need to be cooled to extremely low temperatures. To minimize their interaction with the environment, quantum computers are shielded from light, heat, and electromagnetic radiation, and they are cooled. One thing that happens in such cold places is that certain materials become superconductors – they lose all electrical resistance – and superconductors have quantum properties that can be exploited.
Many countries are experimenting with small quantum networks using optical fiber. There is the potential to connect these via satellite and form a worldwide quantum network – work of great interest to financial institutions. One early impressive result involves a Chinese satellite devoted to quantum experiments, named Micius after a Chinese philosopher who did work in optics. Through it, a team in China connected with a team in Austria – the first time intercontinental quantum key distribution (QKD) had been achieved. Once the connection was secured, the teams sent pictures to one another: the Chinese team sent the Austrians a picture of Micius, and the Austrians sent a picture of Schrödinger to the Chinese.
To actually make practical quantum computers you need to solve a number of problems, the most serious being decoherence – your qubits interacting with something in the environment that is not part of the computation. You need to set a qubit to an initial state and keep it in that state until you need to use it, but a qubit’s quantum state is extremely fragile. The slightest vibration or change in temperature – disturbances known as “noise” in quantum-speak – can cause it to tumble out of superposition before its job has been properly done. That’s why researchers do their best to protect qubits from the outside world in supercooled fridges and vacuum chambers.
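One way to picture decoherence is with a toy noise model (an illustrative assumption, not a model of any real device): a qubit prepared as 0 is flipped by the environment with a small probability p at each time step, so the chance of reading back the right answer decays toward 1/2 – at which point the qubit carries no information at all.

```python
import random

# Toy bit-flip noise model of decoherence. A qubit starts in state 0 and the
# environment flips it with probability p at each step. The longer we wait,
# the closer the readout statistics drift to a 50/50 coin toss.

def survival_probability(p: float, steps: int, trials: int = 20_000) -> float:
    correct = 0
    for _ in range(trials):
        state = 0
        for _ in range(steps):
            if random.random() < p:
                state ^= 1  # the environment flips the qubit
        correct += (state == 0)
    return correct / trials

for steps in (1, 10, 100):
    print(steps, survival_probability(p=0.05, steps=steps))
```

Analytically the survival probability is (1 + (1 − 2p)^steps)/2, which with p = 0.05 falls from about 0.95 after one step to essentially 0.5 after a hundred – hence the race to finish a computation before noise wins.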
Alan Turing is one of the fathers of the theory of computation. In his landmark paper of 1936 he carefully thought about computation. He considered what humans did as they performed computations and broke it down to its most elemental level. He showed that a simple theoretical machine, which we now call a Turing machine, could carry out any algorithm. But remember, Turing was analyzing computation based on what humans do. With quantum computation the focus changes from how humans compute to how the universe computes. Therefore, we should think of quantum computation not as a new type of computation but as the discovery of the true nature of computation.