
What is quantum supremacy?

The race to make faster and faster computers – whether they are designed to play the newest games or predict the weather – has been a cut-throat business for many decades. But another computing race has also been getting more competitive in the last few years: the race between quantum computers and the machines they are intended to replace.

Quantum computers work quite differently from the regular computers that power the modern world. Regular computers process and store data as a series of binary bits, which can be either zero or one. Quantum computers, on the other hand, process data using qubits (quantum bits), which can be zero, one, or any superposition of the two. By utilising the immense scope of this additional freedom in how data is encoded, computer scientists have shown that several common computing tasks can be sped up enormously. I wrote about some of the possibilities in an earlier post, which may be useful background.
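To make this concrete, here is a tiny sketch in Python (purely illustrative, using plain NumPy rather than any quantum computing library) of how a qubit’s state is described by two complex amplitudes:

```python
import numpy as np

# A classical bit is either 0 or 1. A qubit's state is a pair of
# complex amplitudes (a, b) with |a|^2 + |b|^2 = 1: measuring the
# qubit gives 0 with probability |a|^2 and 1 with probability |b|^2.

zero = np.array([1, 0], dtype=complex)       # definitely 0
one = np.array([0, 1], dtype=complex)        # definitely 1
superposition = (zero + one) / np.sqrt(2)    # an equal mix of both

print(np.abs(superposition) ** 2)  # [0.5 0.5] -- 50/50 chance of 0 or 1
```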

[Figure: Operations to carry out modular arithmetic using four qubits. (Source: IBM)]

At the moment, performing a particular task on a quantum computer is generally slower than on a regular (or “classical”) one. In fact, some tasks that quantum computers should be very good at are simply too complicated for existing quantum hardware to attempt. But as the technology progresses, quantum machines might eventually be able to outperform classical ones.

If it happens, that is the moment at which “quantum supremacy” is established.

One factor in determining when quantum supremacy is reached is obviously the performance of quantum computers (more on that later). But their competitors – the classical computers – are also getting faster. Researchers at IBM recently reported a big step forward in the ability of classical supercomputers to perform tasks that should be well suited to quantum computers.

As classical computers get better, the bar for quantum supremacy is being raised.

It is possible to simulate a quantum computer by running a program on a classical computer. The output of the simulated quantum machine should be exactly what an actual quantum device would produce. The problem is that the processing power and memory required grow very quickly as more qubits are simulated: the simulator must keep track of one complex number for every possible configuration of the qubits, so the memory needed doubles with every qubit added. It had been thought that the maximum number of qubits that could be simulated on a classical supercomputer was roughly fifty; after that, it would simply require too much memory. So quantum supremacy would be established if a quantum computer with 50 working qubits could be made.
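To see why fifty qubits was thought to be the limit, here is a back-of-the-envelope calculation in Python. It assumes the straightforward “state vector” approach, which stores one complex amplitude (16 bytes) for each of the 2^n basis states of an n-qubit machine:

```python
# Straightforward state-vector simulation: n qubits need 2**n complex
# amplitudes, at 16 bytes each (two 64-bit floats per amplitude).

def statevector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (30, 40, 50, 56):
    gib = statevector_bytes(n) / 2 ** 30
    print(f"{n} qubits: {gib:,.0f} GiB")
```

At 50 qubits the full state already takes around 16 million gibibytes (16 pebibytes), far more memory than any existing supercomputer has.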

What the researchers from IBM have done is to design a program that allows the simulation of 56 qubits – essentially by trading memory for extra computation, rather than storing the entire quantum state at once. This makes it just that bit harder to get to quantum supremacy!

[Figure: Intel’s 49-qubit chip. (Source: Intel)]

But what about the other side of the race? The hardware for quantum computing is also getting better: just this week, Intel announced that it now has a chip containing 49 qubits. This sounds great, but so far it is difficult to assess how good the chip actually is, because much of the important data is not available.

The number of qubits is an important indicator of the overall performance of a quantum computer, but there are other very important factors. For instance, qubits have to be linked to each other (or, in quantum-mechanical language, “entangled”) so that they can share quantum information and carry out the multi-qubit operations that are required to exploit their power. It can be hard to entangle two qubits unless they are physically close to each other, so in current devices not all the qubits on a chip are linked. The fewer neighbours each qubit is linked to, the more extra operations are needed to move quantum information around the chip during a calculation, so this connectivity has a big impact on performance (see the sketch below).
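As a toy illustration, consider a hypothetical chip where the qubits sit on a line and only nearest neighbours can be entangled (the numbers below are made up for illustration). To make two distant qubits interact, their states must first be shuttled next to each other with SWAP operations:

```python
# Hypothetical 1-D chip: qubits on a line, only nearest neighbours
# can be entangled directly. Interacting two distant qubits first
# requires SWAPs to move their states next to each other.

def swaps_needed(qubit_a: int, qubit_b: int) -> int:
    """SWAP gates needed to make two qubits on a line adjacent."""
    return max(abs(qubit_a - qubit_b) - 1, 0)

for a, b in [(0, 1), (0, 3), (0, 9)]:
    print(f"qubits {a} and {b}: {swaps_needed(a, b)} extra SWAPs")
```

Each SWAP is itself typically built from several two-qubit gates, so poor connectivity quickly adds both runtime and extra opportunities for errors.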

Secondly, controlling a qubit is much more difficult than controlling a classical bit. Qubits are usually manipulated with delicate pulses of microwave radiation, and imperfections in these pulses introduce errors. Because of this, calculations often have to be repeated several times to make sure that the answer is correct and not the result of a control error. The higher the error rate, the more times a calculation must be run to be confident in the answer.
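A rough way to quantify this: suppose each run of a calculation gives the right answer with probability p, and suppose a candidate answer can be checked cheaply once you have it (true for problems like factoring). Then a standard probability bound – nothing specific to any particular machine – tells you how many runs are needed for a given level of confidence:

```python
import math

# If each run succeeds independently with probability p, then after n
# runs the chance of at least one correct (and checkable) answer is
# 1 - (1 - p)**n. Requiring this to reach a confidence level c gives
# n >= log(1 - c) / log(1 - p).

def runs_needed(p_success: float, confidence: float) -> int:
    return math.ceil(math.log(1 - confidence) / math.log(1 - p_success))

for p in (0.9, 0.5, 0.1):
    print(f"success rate {p}: {runs_needed(p, 0.999)} runs for 99.9% confidence")
```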

Finally, there is the decoherence time of the qubits. This one is a bit more technical: the data stored in a qubit can be lost when the outside world impinges on it, destroying the sensitive quantum information. The decoherence time therefore limits how long a quantum computer has to complete a calculation – if it can’t finish in time, it may lose the data it is working on. So if the decoherence time of the qubits is too short, they are next to useless.
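As a very simplified model (real devices have several distinct decoherence timescales, often called T1 and T2, and the numbers below are invented for illustration), you can treat the survival of a qubit’s state as an exponential decay and see how quickly a short decoherence time bites:

```python
import math

# Simplified model: the chance that a qubit still holds its state
# after time t decays exponentially with characteristic time T.

def survival_probability(t_us: float, T_us: float) -> float:
    return math.exp(-t_us / T_us)

# Hypothetical 100-microsecond decoherence time, circuits of
# increasing duration:
for t in (1, 10, 100, 300):
    p = survival_probability(t, 100)
    print(f"circuit lasting {t} us: {p:.1%} chance the state survives")
```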

And of course, none of these things are problems for simulations using classical computers, because those programs work perfectly!

So far, these numbers are not available for Intel’s new chip. In contrast, IBM makes this information freely accessible on GitHub for its machines! Getting these numbers will be crucial to understanding just how close such machines are to establishing quantum supremacy.

But for now, the race is well and truly on!


If you want to read a preprint of the paper reporting the 56 qubit simulation, you can find it here.

Also, if you want to learn more about quantum computing, and even run your own programs on a small quantum computer, check out IBM’s public web site. They’ve got a bunch of neat tutorials and a four-qubit machine on their cloud that you can play with.
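For a flavour of what such a program looks like, here is a minimal sketch using IBM’s open-source Qiskit library (Qiskit’s interface has changed over the years, so the exact calls shown here reflect a recent version rather than what was available when this post was written). It builds the classic two-qubit entangled “Bell” state and computes the measurement probabilities on a classical simulator:

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

# A Hadamard gate on qubit 0 followed by a CNOT entangles the pair.
qc = QuantumCircuit(2)
qc.h(0)
qc.cx(0, 1)

# Work out the final state classically -- no real hardware needed.
state = Statevector.from_instruction(qc)
print(state.probabilities_dict())  # {'00': 0.5, '11': 0.5}
```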
