BY MAIMUN MUSTAFA
Long gone are the days of simple binary operations: quantum computing is pushing the boundaries of technology as we know it. Contrary to what we may have seen in the Ant-Man and Avengers movies, progress in quantum computing by Google has ushered in a brave new world as we head into the fourth industrial revolution.
The basic principle of quantum computation is that quantum properties can be used to represent and structure data, and that quantum mechanisms can be devised and built to perform operations on this data. Classical computers are based on bits, whereas a quantum computer is based on quantum bits, better known as qubits. These qubits are physically derived from small quantum objects at the atomic level, such as electrons or photons, where a pure quantum mechanical state such as spin encodes the ones and zeros that traditional computers use for processing. As a result, the potential processing power increases manyfold.
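The contrast between a bit and a qubit can be sketched in a few lines of Python. This is a toy classical simulation of a single qubit's amplitudes, not real quantum hardware; the function names are illustrative only:

```python
import math

# A qubit's state is a pair of amplitudes (a, b) for the basis states |0> and |1>,
# with |a|^2 + |b|^2 = 1. Measuring yields 0 with probability |a|^2.
def hadamard(state):
    """Apply the Hadamard gate, which puts a basis state into equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def measure_probabilities(state):
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

# Start in |0> (a classical bit could only be 0 or 1), apply H:
# the qubit is now a 50/50 superposition of 0 and 1.
plus = hadamard((1.0, 0.0))
p0, p1 = measure_probabilities(plus)
```

Unlike a classical bit, the qubit above carries both outcomes at once until it is measured; a register of n such qubits requires 2^n amplitudes to describe classically.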
Contemporary Quantum Computing
A quantum computer is any device for computation that makes direct use of distinctively quantum mechanical phenomena, such as superposition and entanglement, to perform operations on data. Although quantum computing is still in its infancy, experiments have been carried out in which quantum computational operations were executed on a very small number of qubits. Research in both theoretical and applied areas continues at a fast pace, attracting funding from national governments and military agencies. These efforts have produced quantum computers for both civilian and national security purposes, such as cryptanalysis. Large-scale quantum computers may be able to solve certain problems exponentially faster than any of our classical computers.
Bringing in the Edge
Quantum computers are different from other computers such as DNA computers and traditional computers based on transistors. Some computing architectures such as optical computers may use classical superposition of electromagnetic waves, but without some specifically quantum mechanical resources such as entanglement, they have less potential for computational speed-up than quantum computers.
Quantum Threats to Cyber Security
The speed of a quantum computer would allow it to break many of the cryptographic systems in use today. In particular, most of the popular public-key ciphers are based on the difficulty of factoring integers, including forms of RSA.
These are used to protect secure web pages, encrypted email, and many other types of data. Breaking these would have significant ramifications for electronic privacy and security. The only way to increase the security of an algorithm like RSA would be to increase the key size and hope that an adversary does not have the resources to build and use a powerful enough quantum computer. It seems plausible that it will always be possible to build classical computers that have more bits than the number of qubits in the largest quantum computer.
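The asymmetry RSA relies on can be illustrated with a toy factoring routine. This is a deliberately naive sketch: real RSA moduli are 2048 bits or more, far beyond any such approach, which is exactly why increasing the key size defends against classical attackers:

```python
import math

def trial_division(n):
    """Factor n by trial division. The loop runs up to sqrt(n),
    so the cost grows exponentially in the bit-length of n."""
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return d, n // d
    return n, 1  # n is prime

# A toy "RSA modulus": the product of two small primes (61 * 53).
# A large quantum computer running Shor's algorithm could factor even
# cryptographic-size moduli in polynomial time, breaking the scheme.
p, q = trial_division(3233)
```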
The Google Quantum Effort
Google AI has recently launched a research effort that aims to build quantum processors and develop novel quantum algorithms to dramatically accelerate computational tasks for machine learning, and, by doing so, has just taken a quantum leap in computer science. Using the company’s state-of-the-art quantum computer, called Sycamore, Google has claimed a form of “quantum supremacy” over the most powerful supercomputers in the world by solving a problem considered virtually impossible for normal machines. Google’s new machine completed the complex computation in 200 seconds, a calculation that would have taken even the most powerful supercomputers approximately 10,000 years to finish.
The team of researchers was led by John Martinis, an experimental physicist at the University of California, Santa Barbara. “It is likely that the classical simulation time, currently estimated at 10,000 years, will be reduced by improved classical hardware and algorithms,” Brooks Foxen, a graduate student researcher in Martinis’ lab, said in a statement. “But since we are currently 1.5 trillion times faster, we feel comfortable laying claim to this achievement,” he added, referring to the supremacy of quantum computers.

Quantum computers can take advantage of the wacky physics of quantum mechanics to solve problems that would be extremely difficult, if not impossible, for classical, semiconductor-based computers to solve. The calculation that Google chose to conquer is the quantum equivalent of generating a very long list of random numbers and checking their values a million times over. The result is a solution not particularly useful outside of the world of quantum mechanics, but it has big implications for the processing power of a device.

Ordinary computers perform calculations using “bits” of information, which, like on-and-off switches, can exist in only two states: either 1 or 0. Quantum computers use quantum bits, or “qubits,” which can exist as both 1 and 0 simultaneously. Google’s quantum computer consists of microscopic circuits of superconducting metal that entangle 53 qubits in a complex superposition state. The entangled qubits generate a random number between zero and 2^53, but due to quantum interference, some random numbers show up more than others. When the computer measures these random numbers millions of times, a pattern arises from their uneven distribution.
“For classical computers, it is much more difficult to compute the outcome of these operations, because it requires computing the probability of being in any one of the 2^53 possible states, where the 53 comes from the number of qubits — the exponential scaling [of states] is why people are interested in quantum computing, to begin with,” Foxen said. Taking advantage of the strange properties of quantum entanglement and superposition, Martinis’ lab produced this distribution pattern using the Sycamore chip in 200 seconds. On paper, it’s easy to show why a quantum computer could outperform traditional computers. Demonstrating the task in the real world is another story. Whereas classical computers can stack millions of operating bits in their processors, quantum computers struggle to scale the number of qubits they can operate with. Entangled qubits become untangled after short periods and are susceptible to noise and errors. Although this Google achievement is certainly a feat in the world of quantum computing, the field is still in its infancy and practical quantum computers remain far on the horizon, the researchers said.
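The exponential scaling Foxen describes can be made concrete with a back-of-envelope calculation: simply storing the full state vector of 53 qubits, at 16 bytes per complex amplitude, already exceeds the memory of any existing machine (real simulators use cleverer compressed representations, so this is an upper-bound sketch):

```python
# The classical description of n qubits needs 2**n complex amplitudes.
# At 16 bytes per amplitude (two 8-byte floats), memory doubles per qubit.
def state_vector_bytes(n_qubits, bytes_per_amplitude=16):
    return (2 ** n_qubits) * bytes_per_amplitude

# 53 qubits: 2**53 amplitudes, i.e. 2**57 bytes = 128 pebibytes.
petabytes = state_vector_bytes(53) / 1024 ** 5
```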
Quantum Computing and Neven’s Law
In December 2018, scientists at Google AI ran a calculation on Google’s best quantum processor. They were able to reproduce the computation using a regular laptop. Then in January, they ran the same test on an improved version of the quantum chip. This time they had to use a powerful desktop computer to simulate the result. By February, there were no longer any classical computers in the building that could simulate their quantum counterparts. The researchers had to request time on Google’s enormous server network to do that. “Somewhere in February I had to make calls to say, ‘Hey, we need more quota,’” said Hartmut Neven, the director of the Quantum Artificial Intelligence Lab. “We were running jobs comprised of a million processors.” That rapid improvement has led to what’s being called “Neven’s law,” a new kind of rule to describe how quickly quantum computers are gaining on classical ones. The rule began as an in-house observation before Neven mentioned it in May at the Google Quantum Spring Symposium. There, he said that quantum computers are gaining computational power relative to classical ones at a “double exponential” rate—a staggeringly fast clip.
With double exponential growth, “it looks like nothing is happening, nothing is happening, and then whoops, suddenly you’re in a different world,” Neven said. “That’s what we’re experiencing here.” Even exponential growth is pretty fast. It means that some quantity grows by powers of 2: The first few increases might not be that noticeable, but subsequent jumps are massive. Moore’s law, the famous guideline stating (roughly) that computing power doubles every two years, is exponential. Doubly exponential growth is far more dramatic. Instead of increasing by powers of 2, quantities grow by powers of powers of 2: 2^(2^1) = 4, 2^(2^2) = 16, 2^(2^3) = 256, 2^(2^4) = 65,536. Doubly exponential growth featured in the recent Quanta story “Computer Scientists Expand the Frontiers of Verifiable Knowledge,” where it described the extreme rate at which certain computational problems increase in complexity. Doubly exponential growth is so singular that it’s hard to find examples of it in the real world. The rate of progress in quantum computing may be the first.
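The difference between the two growth laws is easy to tabulate. A few lines of Python show how quickly 2^(2^n) pulls away from the already-fast 2^n:

```python
# Moore's-law-style growth: 2**n (doubling each step).
# Neven's-law-style growth: 2**(2**n) (the exponent itself doubles).
def exponential(n):
    return 2 ** n

def doubly_exponential(n):
    return 2 ** (2 ** n)

# After four steps: 16 vs 65,536 -- "suddenly you're in a different world".
growth = [(n, exponential(n), doubly_exponential(n)) for n in range(1, 5)]
```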
The doubly exponential rate at which, according to Neven, quantum computers are gaining on classical ones is a result of two exponential factors combined with each other. The first is that quantum computers have an intrinsic exponential advantage over classical ones: If a quantum circuit has four quantum bits, for example, it takes a classical circuit with 16 ordinary bits to achieve equivalent computational power. This would be true even if quantum technology never improved. The second exponential factor comes from the rapid improvement of quantum processors. Neven says that Google’s best quantum chips have recently been improving at an exponential rate. (This rapid improvement has been driven by a reduction in the error rate in the quantum circuits. Reducing the error rate has allowed the engineers to build larger quantum processors, Neven said.) If classical computers require exponentially more computational power to simulate quantum processors, and those quantum processors are growing exponentially more powerful with time, you end up with this doubly exponential relationship between quantum and classical machines. Not everyone is convinced by this. For one thing, classical computers are not standing still. Ordinary computer chips continue to improve, even if Moore’s law may be ending. In addition, computer scientists constantly devise more efficient algorithms that help classical computers keep pace. “When looking at all the moving parts, including improvements on the classical and quantum sides, it’s hard for me to say it’s doubly exponential,” said Andrew Childs, the co-director of the Joint Center for Quantum Information and Computer Science at the University of Maryland. While the exact rate at which quantum computers are closing in on classical ones might be debatable, there’s no doubt quantum technology is improving, and fast.
“I think the undeniable reality of this progress puts the ball firmly in the court of those who believe scalable quantum computing can’t work,” wrote Scott Aaronson, a computer scientist at the University of Texas, Austin, in an email. “They’re the ones who need to articulate where and why the progress will stop.”
A paramount goal in the field of quantum computing is to perform an efficient quantum calculation that cannot be simulated in any reasonable amount of time on even the most powerful classical computer (currently the Summit supercomputer at Oak Ridge National Laboratory). Among the different research groups developing quantum computers, Google has been particularly vocal about its pursuit of this milestone, known as “quantum supremacy.” So far, quantum supremacy has proved elusive—sometimes seemingly around the corner, but never yet at hand. But if Neven’s law holds, it can’t be far away. Neven wouldn’t say exactly when he anticipates the Google team will achieve quantum supremacy, but he allowed that it could happen soon. “We often say we think we will achieve it in 2019,” Neven said. “The writing is on the wall.” Reprinted with permission from Quanta Magazine, an editorially independent publication of the Simons Foundation whose mission is to enhance public understanding of science by covering research developments and trends in mathematics and the physical and life sciences. (Hartnett, A New “Law” Suggests Quantum Supremacy Could Happen This Year, 2019)
Quantum Leap Forward
Predicting the future is always difficult, but it can be attempted when the extrapolation from current devices does not span too many orders of magnitude. However, creating a quantum computer that can run a complex program like Shor’s algorithm and find the private key of a 1024-bit RSA-encrypted message requires building a machine that is more than five orders of magnitude larger, and has error rates about two orders of magnitude better, than current machines, in addition to developing the software development environment to support this machine.
The progress required to bridge this gap makes it impossible to project the time frame for a large error-corrected quantum computer, and while significant progress in these areas is commendable, there is no guarantee that all of these challenges will be overcome. The process of bridging this gap might expose unanticipated challenges, require techniques that are not yet invented, or shift owing to new results of foundational scientific research that change our understanding of the quantum world.
Given the unique characteristics and challenges of quantum computers, they are unlikely to be useful as a direct replacement of traditional computers. In fact, they require a number of classical computers to control their operations and carry out computations needed to implement quantum error correction. Thus, they are currently being designed as special-purpose devices operating in a complementary fashion with classical processors, analogous to a co-processor or an accelerator.
In rapidly advancing fields, where there are many unknowns and hard problems, the rate of overall development is set by the ability of the whole community to take advantage of new approaches and insights. Fields where research results are kept secret or proprietary progress much more slowly. Fortunately, many quantum computing researchers have been open about sharing advances to date, and the field will benefit greatly by continuing with this philosophy.
It is also clear that a technology’s progress depends on the resources, both human and capital, devoted to it. Improved technology may generate exponentially increasing revenue, enabling reinvestment in research and development (R&D) and attracting new talent and industries to help innovate and scale the technology to the next level. As with silicon technology, sustained exponential growth for qubits requires an exponentially growing investment; sustaining this investment will likely require a similar virtuous cycle for quantum computers, where smaller machines are commercially successful enough to grow investment in the overall area. In the absence of intermediate success yielding commercial revenue, progress will depend on governmental agencies continuing to increase funding of this effort. Even in this scenario, successful completion of intermediate milestones is likely to be essential.
Going forward with the 4IR
Among the most immediate and profitable uses for quantum computers will be optimization. Ride-sharing apps like Uber will be able to locate the fastest routes to pick up and drop off as many customers as possible, while e-commerce giants like Amazon will be able to find the most cost-effective way to deliver billions of packages during the holiday gift-buying rush. These questions involve crunching hundreds to thousands of variables at once, a feat that modern supercomputers just can’t handle. So for a world that is becoming increasingly reliant on data and artificial intelligence, quantum computing may be a godsend.

Similarly, the reason the weather channel sometimes gets it wrong is that there are too many environmental variables for its supercomputers to process (that, and sometimes poor weather data collection). With a quantum computer, weather scientists could not only forecast near-term weather patterns far more accurately, but also create better long-term climate assessments to predict the effects of climate change.

Then there is personalized medicine. Decoding DNA and the unique microbiome is crucial for future doctors to prescribe drugs that are perfectly tailored to the body. While traditional supercomputers have made strides in decoding DNA cost-effectively, the microbiome is far beyond their reach, but not so for future quantum computers.
Quantum computers will also allow pharmaceutical companies to better predict how different molecules react with their drugs, thereby significantly speeding up pharmaceutical development and lowering prices.

The space telescopes of today (and tomorrow) collect enormous amounts of astronomical imagery data each day, tracking the movements of trillions of galaxies, stars, planets, and asteroids. Sadly, this is far too much data for today’s supercomputers to sift through to make meaningful discoveries on a regular basis. However, with a mature quantum computer combined with machine learning, all this data could finally be processed efficiently, potentially opening the door to the discovery of hundreds to thousands of new planets daily by the early 2030s. Similarly, the raw computing power these quantum computers enable will allow scientists and engineers to devise new chemicals and materials, as well as better-functioning engines and, of course, cooler Christmas toys.

Using traditional computers, machine-learning algorithms need enormous amounts of curated and labelled examples (big data) to learn new skills. With quantum computing, machine-learning software could begin to learn more like humans do, picking up new skills using less data, messier data, and often with fewer instructions. This application is also a topic of excitement among researchers in the artificial intelligence (AI) field, as this improved natural learning capacity could accelerate progress in AI research by decades. More on this in our Future of Artificial Intelligence series.

Sadly, the last application is the one that has most researchers and intelligence agencies nervous. All current encryption services depend on creating passwords that would take a modern supercomputer thousands of years to crack; quantum computers could theoretically rip through these encryption keys in under an hour. Banking, communication, national security services, and the internet itself all depend on reliable encryption to function.
(Oh, and forget about Bitcoin as well, given its core dependence on encryption.) If these quantum computers work as advertised, all of these industries will be at risk, at worst endangering the entire world economy until we build quantum encryption to keep pace. Quantum computers will also enable near-perfect, real-time language translation between any two languages, either over a Skype chat or through the use of an audio wearable or implant in your ear. In 20 years, language will no longer be a barrier to business and everyday interactions. For example, a person who only speaks English could more confidently enter into business relationships with partners in foreign countries where English brands would otherwise have failed to penetrate, and when visiting said foreign countries, this person may even fall in love with a certain somebody who happens to speak only Cantonese.
From time travel to romance, quantum computing surely holds much drama for the future of technology and the universe as we know it. The next question to ask is, ‘OK Google, what’s next for quantum computing?’
About the Quantum Realm
Physicist Richard Feynman, who was deeply involved in the development of the first atomic bomb, proposed significant theories of quantum electrodynamics, a realm concerned with the way electrons interact with one another through the electromagnetic force, which is propagated by the photon. In creating the Nobel Prize-winning, simple visuals of the possible interactions between an electron and a photon, among other atomic interactions, Feynman also predicted that antiparticles (particles which possess a charge opposite to that of their mirror particle) are actually just normal particles moving backwards in time.
Feynman, among others, began to investigate the generalization of conventional information science concepts to quantum physical processes, considering the representation of binary numbers in relation to the quantum states of two-state quantum systems: in simple words, by simulating quantum systems not with conventional computers but with other quantum systems constructed for this purpose.
David Deutsch, of Oxford University, published a theoretical paper describing a universal quantum computer, proving that if a two-state system could be made to evolve by means of a set of simple operations, any such evolution could be produced, and made to simulate any physical system. These operations came to be called quantum ‘gates’, as they function similarly to binary logic gates in classical computers.
Peter Shor, working for AT&T, proposed a method using entanglement of qubits and superposition to find the prime factors of an integer, a rather valuable process as many encryption systems exploit the difficulty in finding factors of large numbers. In principle, his algorithm would far surpass the efficiency of any known computer when executed on a quantum computer. Shor’s discovery proved vital in stimulating research by physicists and computer scientists on the issue.
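Shor's insight was to reduce factoring to period finding: the quantum computer finds the period r of f(x) = a^x mod N, and a purely classical step then extracts the factors. The sketch below demonstrates that classical reduction on a toy number; the brute-force period search stands in for the quantum step, which is the only part that needs a quantum computer:

```python
import math

def find_period(a, n):
    """Brute-force stand-in for the quantum step: smallest r with a**r = 1 (mod n)."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_part(a, n):
    """Classical post-processing from Shor's algorithm: turn a period into factors."""
    r = find_period(a, n)
    if r % 2 != 0:
        return None  # need an even period; retry with another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None  # trivial square root of 1; retry with another a
    return math.gcd(y - 1, n), math.gcd(y + 1, n)

# Factor 15 using base a = 7: the period of 7**x mod 15 is 4,
# and gcd(7**2 - 1, 15), gcd(7**2 + 1, 15) recover the factors 3 and 5.
factors = shor_classical_part(7, 15)
```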
The National Institute of Standards and Technology and the California Institute of Technology jointly contemplated the problem of shielding a quantum system from environmental influences and performed experiments with magnetic fields, which allow particles (ions) to be trapped and cooled to a quantum state. This method, however, allowed only devices of a few bits to be created, ones which lose coherence rapidly.
A team of researchers from the University of California at Berkeley, MIT, Harvard University, and IBM pursued a similar technique using nuclear magnetic resonance (NMR), a technology that manipulates quantum information in liquids. They attempted to ameliorate the threat of decoherence by working with vast ensembles, allowing each qubit to be represented by many molecules and thus decreasing the effect of external forces. By varying the electromagnetic field, certain oscillations can be found that allow particular spins to flip between states, and the constant motion of molecules in liquids creates interactions that allow the construction of logic gates through NMR. The team developed a 2-bit quantum computer made from a thimbleful of chloroform; input consisted of radio-frequency pulses directed into the liquid. The algorithm run on this quantum computer was one devised by Lov Grover of Bell Laboratories, whose quantum search algorithm runs in O(√N) time. With the quantum computer developed, a list of four items was subjected to this algorithm, which proved able to find the desired item in a single step.
In 1998, the feasibility of quantum teleportation was proposed by an international team of researchers, who based their conclusions on a theorem of quantum mechanics called the Einstein-Podolsky-Rosen effect. The group theorized that two entangled “transporter” particles introduced to a third “message” particle might transfer properties from one to the other. The idea was put into practice six years later by researchers at the University of Innsbruck in Austria: two pairs of entangled photons were exposed to each other, and it was revealed that the polarization state of one may be transferred to the other. The discovery has implications for data transfer and networking among quantum particles in quantum computing.
*The writer is a serial entrepreneur in the fields of innovative technology, business facilitation and communication. He can be reached at firstname.lastname@example.org