Quantum computing has been a concept since the 1980s, yet it has remained outside the domain of real-world high performance computing. Through the era of Moore's Law, with its exponential progress in feature size, clock rates, and resulting performance, alternative paradigms and technologies attracted little attention or interest. But there has remained a curiosity among a limited community, primarily but not only in academia, that has driven slow but persistent advances in associated ideas including theory, technologies, algorithms and, most recently, commercially available proof-of-concept systems.
The driving motivation is that, at least in principle, some problems could be solved by a quantum computer that a conventional supercomputer could never, in practical terms, complete within a human lifetime. The poster child for this is factoring products of prime numbers for cryptology. But there are many obstacles between this extraordinary ambition and the reality of today's technological capabilities; obstacles that have persistently deferred interest in what has been perceived as unlikely speculation. Nonetheless, progress through incremental steps by the international research community has reversed the conventional viewpoint. The quantum community can no longer be ignored. Although still at its most inchoate phase, early results are beginning to suggest that quantum computers may one day make a real contribution to the domain of high performance computing; maybe.
Of course, those of us with a practical Newtonian upbringing, enhanced with a bit of Einstein's equations (special relativity can be derived with high school algebra and trigonometry, and we all quote E = mc²), understand the many von Neumann derivatives of vector, SIMD, and MPP architectures (including my personal favorite: Beowulf Linux clusters) that comprise the last four decades of supercomputer history. Quantum computing is something completely different. Based on quantum mechanics, quantum computing is as weird compared to conventional practice as quantum mechanics is compared to Newtonian physics. Trying to explain quantum computing is like teaching computer science at Hogwarts. But the remarkable properties of quantum computing derive from the equally counterintuitive mechanisms of quantum mechanics. Among these are superposition and entanglement: just terms to most of us, but central to the unique behaviors these machines make possible.
Superposition is the ability to store many different states simultaneously in a single set of memory units. Conventionally, the memory units are bits. Each bit stores either a 1 or a 0 (if you didn't know that, you probably shouldn't be reading this article). For n bits, the numeric range grows as 2^n, but the register still holds only one value at a time. Superposition allows multiple values to be stored at once in a single set of quantum bits, known as "qubits." Fancy diagrams of spheres with vectors pointing from the center out to the surface (the Bloch sphere) are used to illustrate the idea that multiple values are stored at the same time in a set of qubits. The number of such values is exponential in the number of qubits. Yes, you should already be confused. But the key concept is that quantum computers should not only store this multiplicity of values but be able to process them simultaneously as well. Thus a quantum computer, at least in principle, is an extraordinary parallel computer, not by doing the operations separately, but by doing them in the same logic at the same time.
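To make that exponential bookkeeping concrete, here is a minimal sketch in Python with NumPy. It is only a classical simulation of the state description, not quantum hardware, and the three-qubit size and uniform superposition are arbitrary choices for illustration.

```python
import numpy as np

n = 3  # number of qubits; the state description doubles with each one added

# A classical 3-bit register holds exactly one of the 2**n = 8 values 0..7.
# An n-qubit register is described by 2**n complex amplitudes, one per value.
# The uniform superposition gives equal weight to every basis state at once:
state = np.full(2**n, 1 / np.sqrt(2**n), dtype=complex)

# The squared magnitude of each amplitude is the probability of observing
# that value on measurement; the probabilities must sum to 1.
probabilities = np.abs(state) ** 2
print(probabilities)        # eight entries of 0.125
print(probabilities.sum())  # 1.0
```

Note the cost that makes classical simulation hopeless at scale: the `state` array doubles with every added qubit, so even 50 qubits would demand petabytes of memory.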
There is an important, actually dominant, caveat. You can't "look" at the value(s) in a qubit without the many-valued state collapsing to a single value. Yes, the act of looking actually affects the state of the quantum machine. This is where the property of "entanglement" comes in. If superposition appears very strange, it is nothing compared to entanglement. If two particles interact directly (for example, while near each other), they can become coupled such that, even after they are separated, they remain related. In fact, if you look at one of the two particles, you know the state of the other, distant particle. And, if that is not weird enough, this happens instantaneously; not just very fast but truly instant. Yes, it appears to defy the speed of light. Albert Einstein rejected the idea back in the 1930s, referring to it as "spooky action at a distance." His reaction was entirely reasonable, and he was among the earliest to truly understand the implication. But the father of relativity was wrong: every test performed to date to prove or disprove entanglement has demonstrated its reality. One consequence is that, again in principle, measuring one entangled particle reveals the state of its distant partner without directly disturbing that partner's quantum state.
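A small classical simulation can at least illustrate the perfect correlation described above. The sketch below (Python with NumPy; the Bell state is the standard textbook example, not anything specific to a particular machine) samples joint measurements of a two-qubit entangled pair, and the two readings always agree.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# The Bell state (|00> + |11>)/sqrt(2), written over the four joint
# basis states 00, 01, 10, 11. Only 00 and 11 have nonzero amplitude.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
joint_probs = np.abs(bell) ** 2   # [0.5, 0.0, 0.0, 0.5]

for _ in range(5):
    outcome = rng.choice(4, p=joint_probs)  # one simulated joint measurement
    a, b = outcome >> 1, outcome & 1        # first and second qubit's bit
    print(a, b)                             # always equal: "0 0" or "1 1"
```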
Another problem imposes a barrier between theory and realistic implementation: quantum noise. Keeping qubits stable is extremely hard, perhaps impossible for an indefinite period. To understate the case, the operation of quantum devices is statistical. Getting the right answer is not guaranteed; there is a probability associated with each computation that depends on the system's sensitivity to sources of noise. One can imagine making many runs of the same problem and generating a probability distribution function of the answers. Much of the engineering going into the realization of a conceptual quantum computer is dedicated to resolving, or at least mitigating, the problem of noise. These challenges have been largely understood for the last three decades, and yet quantum computing has been ignored by the HPC community, and for good reason: there was essentially no reason to believe such machines would ever become reality, at least within the lifetime of current practitioners. That may have changed.
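That many-runs strategy is easy to picture with a toy model. The following Python sketch uses an invented noisy_run function with a made-up error rate; it is not any real device's behavior, just an illustration of building a probability distribution of answers over repeated shots and trusting the most frequent one.

```python
from collections import Counter
import random

random.seed(0)

def noisy_run(correct_answer=5, error_rate=0.2, n_outcomes=8):
    """One shot of a hypothetical noisy computation: it returns the right
    answer most of the time and a random outcome otherwise. The single
    error_rate stands in for all of a real device's noise sources."""
    if random.random() < error_rate:
        return random.randrange(n_outcomes)
    return correct_answer

shots = 1000
histogram = Counter(noisy_run() for _ in range(shots))

# The empirical probability distribution over observed answers.
for answer, count in sorted(histogram.items()):
    print(f"answer {answer}: {count / shots:.3f}")

print("most frequent answer:", histogram.most_common(1)[0][0])
```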
Like conventional digital computing, many different technologies are being considered as possible enablers for future quantum computers. Experiments at a number of research institutions have explored this space and are responsible for slowly emerging confidence in several alternative technologies. One company, D-Wave, was founded in Canada in 1999 to create the world's first commercial quantum computers. Based on earlier technology prototypes, D-Wave announced its first commercial offering, the 128-qubit D-Wave One, in 2011, followed in 2013 by the 512-qubit D-Wave Two. These systems employ a technology based on superconducting integrated circuits with pair-wise coupled flux qubits. Although superconducting circuits work at temperatures of around 4 kelvin, the D-Wave machines maintain a thermal environment closer to 20 millikelvin to minimize quantum noise. Even with this success, there is debate as to whether it is a "real" quantum computer (whatever that means) or, rather, an adiabatic annealing machine specialized for optimization problems. The question of actual quantum entanglement in these systems remains unresolved; only recently has there been indirect evidence that the phenomenon is actually occurring. Nonetheless, there is real interest now among such giants as Lockheed Martin, Google, and others, even in these early systems as proofs of concept. The engineering achievements so far are very impressive.
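For a feel of the optimization problems such annealers target, here is a sketch of classical simulated annealing on a tiny, made-up Ising instance. This is not D-Wave's actual programming model or hardware behavior, only the general flavor of energy minimization over coupled binary spins.

```python
import math
import random

random.seed(0)

# Hypothetical couplings J[i, j] between four spins. A negative coupling
# rewards the two spins agreeing; a positive one rewards disagreement.
J = {(0, 1): -1.0, (1, 2): 1.0, (2, 3): -1.0, (0, 3): 1.0}

def energy(spins):
    """Ising energy of a configuration of +/-1 spins."""
    return sum(coupling * spins[i] * spins[j]
               for (i, j), coupling in J.items())

def anneal(steps=5000, temperature=2.0, cooling=0.999):
    spins = [random.choice([-1, 1]) for _ in range(4)]
    for _ in range(steps):
        i = random.randrange(4)
        before = energy(spins)
        spins[i] *= -1                    # propose flipping one spin
        delta = energy(spins) - before
        # Metropolis rule: keep a flip that lowers the energy; keep a
        # worse one only with a temperature-dependent probability.
        if delta > 0 and random.random() >= math.exp(-delta / temperature):
            spins[i] *= -1                # reject: undo the flip
        temperature *= cooling            # cool gradually toward a minimum
    return spins

best = anneal()
print("spins:", best, "energy:", energy(best))
```

An annealing machine is handed exactly this kind of coupling problem and physically relaxes toward a low-energy configuration, rather than executing a sequence of logic gates.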
Demonstration of a number of algorithms has already been achieved on these and other laboratory examples of quantum processing. The field is wide open, and the future, while uncertain (fittingly, for a technology built on quantum uncertainty), is very promising. HPC can no longer ignore, nor should it want to, the paradigm shift that may provide part of the solution to the end of Moore's Law.
Dr. Thomas Sterling holds the position of Professor of Informatics and Computing at the Indiana University (IU) School of Informatics and Computing, and serves as Chief Scientist and Executive Associate Director of the Center for Research in Extreme Scale Technologies (CREST).
This article first appeared on TOP500.org.