If the quantum computing era dawned 3 years ago, its rising sun may have hidden behind a cloud. In 2019, Google researchers claimed they had passed a milestone known as quantum supremacy when their Sycamore quantum computer performed an abstruse calculation in 200 seconds that, they said, would tie up a supercomputer for 10,000 years. Now scientists in China have done the calculation in a few hours with ordinary processors. A supercomputer, they say, could beat Sycamore outright.
“I think they’re right that if they had access to a big enough supercomputer, they could have simulated the … task in a matter of seconds,” says Scott Aaronson, a computer scientist at the University of Texas, Austin. The advance takes some of the shine off Google’s claim, says Greg Kuperberg, a mathematician at the University of California, Davis. “Getting within 300 feet of the summit is less exciting than reaching the summit.”
Still, the promise of quantum computing remains intact, say Kuperberg and others. And Sergio Boixo, principal scientist at Google Quantum AI, said in an email that the Google team knew their advantage might not hold for long. “In our 2019 article, we said that classical algorithms would get better,” he said. But, “we don’t think this classical approach can keep pace with quantum circuits in 2022 and beyond.”
The “problem” that Sycamore solved was designed to be difficult for a conventional computer but as easy as possible for a quantum computer, which manipulates qubits that can be set to 0, 1, or, thanks to quantum mechanics, any combination of 0 and 1 at the same time. Together, Sycamore’s 53 qubits, tiny resonant electrical circuits made of superconducting metal, can encode any number from 0 to 2^53 (approximately 9 quadrillion), or even all of them at once.
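The exponential encoding is easy to see in a toy simulation. This sketch (mine, not from the article; a 3-qubit register stands in for Sycamore’s 53, and the variable names are illustrative) shows how n qubits carry 2^n amplitudes, and how one gate per qubit puts the register in a superposition of every value at once:

```python
import numpy as np

# A register of n qubits is described by 2**n complex amplitudes, one per
# n-bit value. Here n = 3; Sycamore's 53 qubits would need 2**53
# amplitudes, approximately 9 quadrillion.
n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                      # all qubits start set to 0

# One Hadamard gate per qubit puts the register into an equal
# superposition of every value from 0 to 2**n - 1 at the same time.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
U = np.array([[1.0]])
for _ in range(n):
    U = np.kron(U, H)               # full-register operation
state = U @ state

probs = np.abs(state) ** 2          # every outcome equally likely: 1/2**n
```

Doubling the register to 53 qubits squares nothing away: each added qubit doubles the amplitude count, which is why brute-force simulation on classical hardware blows up.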
Starting with all qubits set to 0, the Google researchers applied a random but fixed set of logical operations, or gates, to individual qubits and pairs of qubits over 20 cycles, then read out the qubits. Roughly speaking, the quantum waves representing all possible outputs spread out among the qubits, with the gates creating interference that enhanced some outputs and canceled others. So some should have appeared more likely than others. After millions of trials, a spiky output pattern emerged.
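That spiky pattern can be reproduced in miniature. In this sketch (my illustration, not Google’s circuit), a single random unitary on a 6-qubit toy register stands in for the 20 cycles of random gates; interference makes some of the 64 outputs markedly likelier than the uniform average:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6                 # toy register; Sycamore used 53 qubits
dim = 2**n

# A random but fixed circuit amounts to some fixed unitary matrix.
# Here we draw one directly via QR decomposition instead of composing
# 20 cycles of one- and two-qubit gates.
A = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
Q, R = np.linalg.qr(A)
d = np.diag(R)
U = Q * (d / np.abs(d))             # phase fix makes U Haar-like

state = np.zeros(dim, dtype=complex)
state[0] = 1.0                      # all qubits start at 0
probs = np.abs(U @ state) ** 2      # output distribution after the circuit

# Interference enhances some outputs and cancels others, so the largest
# probability sits well above the uniform value 1/dim.
print(probs.max(), 1 / dim)
```

Sampling repeatedly from `probs` is the classical analog of Google’s millions of measurement runs; the fingerprint-like spikes are what Sycamore’s output, and later Zhang’s simulation, had to reproduce.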
Google researchers argued that simulating such interference effects would overwhelm even Summit, a supercomputer at Oak Ridge National Laboratory, which has 9,216 central processing units and 27,648 faster graphics processing units (GPUs). Researchers at IBM, which developed Summit, were quick to respond that if they exploited every bit of the computer’s available hard drive space, it could handle the computation in a few days. Now Pan Zhang, a statistical physicist at the Institute of Theoretical Physics of the Chinese Academy of Sciences, and his colleagues have shown how to beat Sycamore in a paper in press at Physical Review Letters.
Following others, Zhang and his colleagues reformulated the problem as a 3D mathematical array called a tensor network. It consisted of 20 layers, one for each gate cycle, and each layer consisted of 53 points, one for each qubit. Lines connected the points to represent gates, with each gate encoded in a tensor: a 2D or 4D grid of complex numbers. Running the simulation then boiled down to essentially multiplying all the tensors. “The advantage of the tensor network method is that we can use many GPUs to do the calculations in parallel,” says Zhang.
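A toy version makes the “multiply all the tensors” step concrete. In this sketch (my illustration of the general tensor-network idea, not Zhang’s code), a one-qubit gate is a 2D tensor, a two-qubit gate a 4D tensor, and the amplitude of one output string falls out of a single contraction:

```python
import numpy as np

# One-qubit gate: a 2x2 (2D) tensor. Two-qubit gate: a 2x2x2x2 (4D) tensor.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard
CZ = np.diag([1.0, 1, 1, -1]).reshape(2, 2, 2, 2)  # controlled-Z

zero = np.array([1.0, 0.0])                        # a qubit set to 0

# Amplitude of output |00> for the tiny circuit H(q0), H(q1), CZ(q0, q1):
# contracting input legs, gate tensors, and output legs in one einsum call
# is "multiplying all the tensors" in miniature.
amp = np.einsum('a,b,ia,jb,klij,k,l->',
                zero, zero,    # input legs of the two qubits
                H, H,          # one layer of one-qubit gates
                CZ,            # a two-qubit gate linking the qubits
                zero, zero)    # project onto the output string 00
```

The order in which the tensors are contracted, and how the work is split across GPUs, is where the real engineering effort in such simulations goes; this sketch only shows the bookkeeping.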
Zhang and his colleagues also built on a key insight: Sycamore’s calculation was far from exact, so theirs need not be either. Sycamore calculated the distribution of the outputs with an estimated fidelity of 0.2%, enough to distinguish the fingerprint-like spikes from the noise in the circuit. So Zhang’s team traded precision for speed by cutting some lines in their network and removing the corresponding gates. Losing just eight lines made the calculation 256 times faster while maintaining 0.37% fidelity.
The researchers calculated the output pattern for 1 million of the 9 quadrillion possible number strings, relying on their own innovation to obtain a truly random representative set. The computation took 15 hours on 512 GPUs and returned the telltale spiky output. “It’s fair to say that the Google experiment has been simulated on a conventional computer,” says Dominik Hangleiter, a quantum computer scientist at the University of Maryland, College Park. On a supercomputer, the calculation would take a few dozen seconds, Zhang says, 10 billion times faster than the Google team estimated.
The breakthrough underscores the pitfalls of racing a quantum computer against a conventional one, the researchers say. “There is an urgent need for better quantum supremacy experiments,” says Aaronson. Zhang suggests a more practical approach: “We should find some real-world applications to demonstrate the quantum advantage.”
Still, Google’s demo wasn’t just hype, the researchers say. Sycamore required far fewer operations and less power than a supercomputer, Zhang notes. And if Sycamore had slightly higher fidelity, he says, his team’s simulation might not have kept up. As Hangleiter puts it, “Google’s experiment did what it was supposed to do, start this race.”