Quantum Computers, what are they good for?
It has been generally known for some time that quantum computers can in principle solve problems that are intractable on classical computers — like factoring large numbers to break public-key encryption (companies like Google have already increased the key lengths they use, as a precaution against early breakthroughs). With multiple corporations demonstrating working prototype quantum computers, can we look forward to using these amazing machines to accelerate machine learning? Perhaps a Quantum ML accelerator card to compete with GPUs/FPGAs/ASICs?
Quantum computers have other-worldly computational abilities (still mostly theoretical), but also other-worldly limitations. Take something as basic as RAM on a classical computer. We take for granted the ability to copy data to and from RAM, and to “copy and paste” it at will. Quantum computers have something called Q-RAM — Quantum RAM — in which data is stored in ever-so-delicate quantum superpositions (at temperatures very close to absolute zero, so that heat can’t disrupt these states). Unfortunately, data stored with such exquisite care is subject to the ‘no-cloning’ theorem, which proves that quantum states cannot be copied perfectly. The full story is fascinating but out of scope for this book, although the heart of the argument is short enough to sketch below. This complicates things quite a bit, not least error correction, which cannot be done using the existing methods from classical computing.
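For the mathematically curious, here is that sketch, in standard Dirac notation (where ⟨ψ|φ⟩ measures the overlap between two quantum states). Suppose some fixed quantum operation U could copy any state |ψ⟩ onto a blank register:

    U(|ψ⟩ ⊗ |0⟩) = |ψ⟩ ⊗ |ψ⟩   for every state |ψ⟩

Quantum operations must preserve overlaps, so applying U to two states |ψ⟩ and |φ⟩ and comparing the overlaps before and after copying gives ⟨ψ|φ⟩ = ⟨ψ|φ⟩². The only numbers equal to their own square are 0 and 1, meaning a copying machine can only handle states that are already identical or perfectly distinguishable, never the arbitrary superpositions that Q-RAM exists to hold.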
Simply loading classical data (bits — the ones and zeros of non-quantum computers) into qubits (quantum memory), or extracting the results of quantum operations as classical data, can be so complex and expensive as to destroy whatever speed advantage the quantum computer enjoys while operating on the data! This problem is so pronounced that some researchers suggest not applying quantum computers to classical data at all, focusing instead on modelling quantum mechanical problems — the kind we touched on when explaining how machine learning is making it possible to efficiently approximate the rules of quantum mechanics on classical hardware, enabling simulations that may well transform materials science.
“The reason most educated people today don’t know simulating quantum systems is important is because classical computers are so bad at it that it’s never been practical to do. We’ve been living too early in history to understand how incredibly important quantum simulation really is.”
Michael Nielsen, “Quantum Computing for the Very Curious”
Despite the serious drawbacks around operating on classical data (do not expect ever to simply swap quantum hardware in for classical), quantum computers still have strengths that suggest they could excel at machine learning, thanks to their affinity with a branch of mathematics called ‘linear algebra’, a key element of most modern machine learning algorithms. Quantum algorithms have been found that offer exponential gains in performance over their classical counterparts — for these particular tasks, work that takes 1 million hours on a classical machine might be completed in a thousand, or even a hundred, hours on a quantum system. Algorithms of this kind are known for machine learning applications ranging from deep learning to recommender systems. However, fully-fledged commercial systems capable of applying even the algorithms theoreticians have already devised, to datasets large enough to challenge the processing power of classical computers, would probably need to muster in the order of 1 million qubits — about 20,000 times larger than the prototype quantum computers demonstrated at the time of writing.
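To see why an affinity with linear algebra matters so much, recall that the workhorse of a neural network, a single layer, is essentially one matrix-vector multiplication followed by a simple non-linearity. Here is a minimal sketch in Python (the layer sizes and random values are purely illustrative):

    import numpy as np

    # One neural-network layer: output = activation(W @ x + b).
    # Illustrative sizes only: 4 inputs feeding 3 neurons.
    rng = np.random.default_rng(0)
    W = rng.standard_normal((3, 4))  # weight matrix (3 neurons x 4 inputs)
    b = rng.standard_normal(3)       # bias vector
    x = rng.standard_normal(4)       # input vector

    z = W @ x + b                    # the linear algebra: a matrix-vector product
    y = np.maximum(z, 0)             # ReLU non-linearity
    print(y)

Nearly all the arithmetic in training and running such networks is spent inside products like W @ x, exactly the kind of operation that the quantum algorithms mentioned above promise, on paper, to accelerate.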
It is in representing the states of quantum mechanical systems that quantum computers enjoy their most obvious edge over their classical predecessors (and avoid the difficulties associated with loading and retrieving classical bits). To model the quantum mechanical properties of an n atom molecule, a classical computer requires in the order of k^n bits, where k is a number that depends on the number of properties being modelled — let’s take the lower-bound value (2) for now. A 2 atom system will need 2² bits of storage — 4 bits doesn’t sound too bad. The trouble is that a 3 atom molecule needs 2³ (8 bits), 4 atoms need 2⁴ (16 bits), and by the time we reach 64 atoms this doubling has taken us a little past 2 exabytes (a UC Berkeley estimate put all the data produced by humanity up to the year 1999, including all video, audio and text, at about 12 exabytes). At the time of writing, modelling a molecule of 225 atoms in classical memory would require a quantity of flash drives roughly equal in mass to the observable universe. Even a universe-scale classical computer would be hopelessly inadequate for interesting molecules like proteins, which contain thousands of atoms.
What about a quantum computer given the same task? It would need in the order of k×n qubits — so with k = 2, a 2 atom system needs 4 qubits, 3 atoms require 6 qubits, 4 atoms need 8 qubits, and 64 atoms require just 128 qubits. Instead of doubling our memory requirements, each new atom adds just 2 qubits to the total! Although at the time of writing existing quantum computers remain tiny (less than 100 qubits of total memory), the math tells us that this is a viable path to precise simulations of matter at the finest scale. As I described in “self-driving chemistry”, these simulations should ultimately allow us to mass-produce materials with nanoscale structures, transforming sectors as different as energy and medicine.
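A few lines of Python make the gulf between the two growth rates vivid (a sketch assuming the lower-bound value k = 2 used above):

    # Classical memory grows as 2^n bits with atom count n;
    # quantum memory grows as 2*n qubits (taking k = 2 as above).
    for n in [2, 3, 4, 8, 16, 32, 64]:
        classical_bits = 2 ** n
        qubits = 2 * n
        print(f"{n:>2} atoms: {classical_bits:>26,} classical bits vs {qubits:>3} qubits")

By n = 64 the classical column has passed 18 quintillion bits (our 2-exabyte figure), while the quantum column has only just reached 128 qubits.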
Another potential application often raised is search: a famous (in specialised circles!) method called “Grover’s Algorithm” can greatly reduce the time a search takes. If N items need to be checked, Grover’s Algorithm can do it in roughly √N steps, and so (for large N) finishes far ahead of a classical computer, which in the worst case must check all N items — if N were 1 million, the quantum computer would need only about 1,000 steps. This is often described in the press as a way of speeding up database queries, but that’s not quite right: databases are almost always designed to use indexing and other techniques to accelerate search (and there is also the small matter of converting classical data to qubits, which we have already discussed). Grover’s Algorithm is very exciting, but databases are an unlikely application: instead, we are more likely to see it used on tough optimisation problems, where the items being searched are potential solutions to a problem.
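Grover’s Algorithm is also simple enough to simulate for small N on an ordinary computer (slowly, since the simulation itself grows exponentially expensive, which is rather the point). Here is a minimal sketch in Python; the problem size and marked item are arbitrary choices for illustration:

    import numpy as np

    N = 1024      # number of items to search (arbitrary power of 2)
    marked = 137  # index of the item we want (arbitrary)

    # Start in a uniform superposition: equal amplitude on every item.
    psi = np.full(N, 1 / np.sqrt(N))

    # Roughly (pi/4) * sqrt(N) iterations are optimal.
    iterations = int(np.pi / 4 * np.sqrt(N))

    for _ in range(iterations):
        psi[marked] *= -1            # oracle: flip the sign of the marked amplitude
        psi = 2 * psi.mean() - psi   # diffusion: reflect every amplitude about the mean

    print(iterations, psi[marked] ** 2)  # 25 iterations, probability ≈ 0.999

After just 25 iterations (π/4 × √1024 ≈ 25) a measurement would return the marked item with probability around 99.9 per cent, where an unstructured classical search of 1,024 items would need 512 checks on average.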
Summarising: it is already clear that machines with enough qubits would be able to take on tasks beyond the reach of conventional computers. Nevertheless, the strengths and weaknesses of the platform are so different from those of the computers we have today that it is very hard to say anything specific about future applications. Despite great progress in algorithmic development, the hardware remains in its infancy: we haven’t yet reached the quantum equivalent of the moment the microprocessor arrived and “Moore’s Law” kicked in. Expect Quantum Computing to be a really, really big deal — just not for some years.
Want to go deeper into this topic? Visit Quantum Country.