D-Wave Systems — one of the few companies currently developing viable quantum computers — announced details of its new 2,000-qubit processor Wednesday. The company claimed that the new system — built using its largest-ever quantum computing chip — is up to 1,000 times faster than its current 1,000-qubit model, the D-Wave 2X.
In a statement, D-Wave said that the new system would enable larger problems to be solved, and even allow users to fine-tune the computational process to solve problems faster. This, the company added, would extend D-Wave’s “significant lead over all quantum computing competitors.”
“As the only company to have developed and commercialized a scalable quantum computer, we’re continuing our record of rapid increases in the power of our systems, now up to 2,000 qubits. Our growing user base provides real world experience that helps us design features and capabilities that provide quantifiable benefits,” Jeremy Hilton, D-Wave’s senior vice president of systems, said. “A good example of this is giving users the ability to tune the quantum algorithm to improve application performance.”
Qubits — or quantum bits — are quantum computing’s equivalent of the bits — 0 and 1 — used in conventional computers. Unlike conventional bits, which can exist in only one of the two states at a time, qubits can exist in a state of superposition. This phenomenon, which allows a qubit to occupy both states at the same time, coupled with quantum entanglement — wherein subatomic particles are physically separated but act as if they are connected — is what gives quantum computers a significant advantage over conventional computers.
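The superposition idea can be sketched numerically. The toy example below uses plain NumPy, not any D-Wave or quantum-hardware API: a qubit is modeled as a two-component vector of amplitudes, and measurement probabilities follow from the squared magnitudes of those amplitudes (the Born rule).

```python
import numpy as np

# A single qubit as a 2-component state vector of amplitudes for |0> and |1>.
ket0 = np.array([1.0, 0.0])  # definitely 0
ket1 = np.array([0.0, 1.0])  # definitely 1

# An equal superposition: until measured, the qubit is "in both states at once".
plus = (ket0 + ket1) / np.sqrt(2)

# Born rule: the probability of each measurement outcome is the
# squared magnitude of the corresponding amplitude.
probs = np.abs(plus) ** 2
print(probs)  # probabilities ≈ [0.5, 0.5]
```

A register of n such qubits is described by 2^n amplitudes, which is where the potential advantage over conventional bits comes from.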
Creating a viable, large-scale universal quantum computer is a goal many researchers and companies, including Google and IBM, are currently pursuing. However, many have questioned whether D-Wave’s machines, which rely on a technique called quantum annealing, can really be classified as “quantum” computers.
“Just because [their chips] are quantum, that doesn’t make them a quantum computer,” Greg Kuperberg, a mathematician at the University of California, Davis, told The Verge. “That’s like saying that any invention that is influenced by air must be an airplane. Of course, it’s not true; it might instead be bagpipes.”
Quantum annealing involves turning a problem into a topographic landscape of crests and troughs, with the lowest point representing the best solution. This works well for optimization problems, where the goal is to find the best among many possible solutions, but it is far from the Holy Grail that researchers in the field are working toward — a universal machine that can carry out immensely complicated operations, such as running Shor’s algorithm, which could render today’s encryption systems obsolete.
Moreover, according to a 2014 study published in the journal Science, the D-Wave Two — the company’s second commercially viable quantum computer — was no faster than conventional systems.
“There was only ever a hope that a quantum annealer would be better,” study co-author Matthias Troyer told The Verge. “It turns out that at least for the architecture implemented by D-Wave, [the computation] can be mimicked very efficiently on a classical computer. ... We don’t have any evidence of quantum speedup in this architecture and building a bigger machine will not help that.”
D-Wave, for its part, says that it will stick with quantum annealing in its machines, and has argued that rival approaches remain theoretical.
“Our focus is on delivering quantum technology for customers in the real world,” D-Wave CEO Vern Brownell said in the statement. “As we scale our processors, we’re adding features and capabilities that give users new ways to solve problems. These new features can enable machine learning applications that we believe are not available on classical systems.”