Google’s D-Wave Quantum Computer Outperforms Traditional PCs by 100 Million Times
Google’s artificial intelligence lab has announced that its controversial D-Wave quantum computer has reached a new milestone, solving an optimization problem via “quantum annealing” 100 million times faster than a conventional single-core computer.
NASA and Google bought the D-Wave 2X quantum computer back in 2013. The device, which sits at NASA’s Ames Research Center in Mountain View, California, is claimed to be the “world’s first commercial quantum computer.” In theory, D-Wave’s hardware could be up to 3,600 times faster than a supercomputer; until now, however, the device had never been shown to truly tap into quantum processing.
What makes quantum computers so much faster is that, instead of using bits that are either 0 or 1 as classical computers do, they use “qubits” that can exist as 0, 1, or a superposition of both. In turn, that allows them to work through possible solutions far faster than a conventional machine.
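To make the superposition idea concrete, here is a minimal sketch in plain Python of measuring a single qubit. The state and the `measure` helper are illustrative names of my own, not anything from D-Wave or Google; the only physics used is that the probability of reading 0 is the squared magnitude of the qubit's 0-amplitude.

```python
import math
import random

def measure(alpha, beta, trials=10000, seed=0):
    """Simulate repeated measurement of a qubit with amplitudes
    (alpha, beta) for the 0 and 1 outcomes; |alpha|^2 + |beta|^2 = 1.
    Returns the observed fraction of 0 outcomes."""
    rng = random.Random(seed)
    p0 = abs(alpha) ** 2  # Born rule: probability of measuring 0
    zeros = sum(1 for _ in range(trials) if rng.random() < p0)
    return zeros / trials

# A classical bit is definitely 0 or definitely 1.
# An equal superposition (alpha = beta = 1/sqrt(2)) yields ~50/50 outcomes.
alpha = beta = 1 / math.sqrt(2)
frac0 = measure(alpha, beta)
```

Before measurement the qubit genuinely carries both amplitudes at once, which is what lets quantum hardware consider many candidate solutions simultaneously; measurement collapses it to a single classical bit.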
The team of researchers from Google published results on the arXiv preprint server claiming that they used a quantum annealing technique, instead of conventional simulated annealing, to solve problem instances. Here is Google’s explanation of what happened:
We found that for problem instances involving nearly 1000 binary variables, quantum annealing significantly outperforms its classical counterpart, simulated annealing. It is more than 10^8 times faster than simulated annealing running on a single core. We also compared the quantum hardware to another algorithm called Quantum Monte Carlo. This is a method designed to emulate the behavior of quantum systems, but it runs on conventional processors. While the scaling with size between these two methods is comparable, they are again separated by a large factor sometimes as high as 10^8.
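For context, the classical baseline Google benchmarked against, simulated annealing, can be sketched in a few lines of Python. This is a generic textbook version over binary variables, not Google's actual benchmark code; the function names and the toy chain objective are my own illustrations.

```python
import math
import random

def simulated_annealing(energy, n, steps=20000, t0=2.0, t1=0.01, seed=1):
    """Classical simulated annealing over n binary variables.
    energy: callable taking a sequence of 0/1 values, returns a float to minimize."""
    rng = random.Random(seed)
    state = [rng.randint(0, 1) for _ in range(n)]
    e = energy(state)
    best, best_e = state[:], e
    for step in range(steps):
        t = t0 * (t1 / t0) ** (step / steps)  # geometric cooling schedule
        i = rng.randrange(n)
        state[i] ^= 1  # propose flipping one bit
        e_new = energy(state)
        # Accept improvements always; accept worse moves with Boltzmann probability
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new
            if e < best_e:
                best, best_e = state[:], e
        else:
            state[i] ^= 1  # reject: flip the bit back
    return best, best_e

# Toy Ising-style objective: neighboring variables want to agree.
def ising_chain(s):
    return sum(1 if s[i] != s[i + 1] else -1 for i in range(len(s) - 1))

best, best_e = simulated_annealing(ising_chain, n=20)
```

Simulated annealing escapes local minima by occasionally accepting worse states, with that tolerance shrinking as the "temperature" cools. Quantum annealing replaces these thermal jumps with quantum tunneling through energy barriers, which is where the claimed 10^8 speedup on ~1000-variable instances comes from.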
Of course, the announcement is potentially groundbreaking for researchers and scientists, but that doesn’t mean quantum computing will reach the masses any time soon.
The D-Wave was engineered specifically for problems with exponential complexity, which NASA and other groups often face when processing large amounts of data, such as in air traffic control modelling and space missions. Still, it will be interesting to see whether such devices can develop into more generalized computing systems that make everyday machines faster.
Gohar is the lead editor at TechFrag. He has a wide range of interests when it comes to tech but he's currently spending a big chunk of his time writing about privacy, cyber security, and anything policy related.