(Physics.aps.org) Katie McCormick, a science writer, explains research showing that a new classical algorithm shrinks a recent claim of so-called quantum advantage by a factor of roughly one billion. Inside Quantum Technology News summarizes:
Those developing quantum algorithms are pitted against each other and against those working on classical algorithms in the race to achieve the coveted “advantage” of a quantum computer.
Jacob Bulmer of the University of Bristol, UK, Bryn Bell of Imperial College London, and colleagues have significantly slashed the advantage of a recently demonstrated quantum computation, known as Gaussian boson sampling, over its classical counterpart. The team behind that advantage claim, at the University of Science and Technology of China (USTC), had asserted that a classical computation of Gaussian boson sampling would take 600 million years on the world’s fastest supercomputer. But Bulmer, Bell, and colleagues show that their classical algorithm can do it in just 73 days.
Gaussian boson sampling is an adaptation of a 2011 idea from Scott Aaronson of the University of Texas at Austin and Alex Arkhipov, who, at the time, was at the Massachusetts Institute of Technology. The idea, known as boson sampling, proposed sending a beam of single photons through a network of beam splitters to create a complex web of correlations between the paths of the photons.
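In the original boson-sampling proposal, the probability of detecting a given collision-free pattern of photons at the output is governed by the permanent of a submatrix of the interferometer’s unitary, a quantity that is classically hard to compute. That is a background fact about the Aaronson-Arkhipov scheme rather than something stated in the summary above, and the sketch below is only an illustration: the mode labels, matrix size, and random interferometer are invented for the example.

```python
import itertools
import numpy as np

def permanent(M):
    """Brute-force matrix permanent: like a determinant, but without the signs."""
    n = M.shape[0]
    return sum(
        np.prod([M[i, p[i]] for i in range(n)])
        for p in itertools.permutations(range(n))
    )

# Toy 4-mode interferometer: a random unitary built via QR decomposition.
rng = np.random.default_rng(seed=1)
X = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
U, _ = np.linalg.qr(X)

# Two single photons enter modes 0 and 1. The probability of detecting one
# photon in mode 2 and one in mode 3 is |Perm(U_sub)|^2, where U_sub keeps
# the rows of the input modes and the columns of the output modes.
U_sub = U[np.ix_([0, 1], [2, 3])]
print(abs(permanent(U_sub)) ** 2)
```

The brute-force permanent here sums over all permutations and so scales factorially with the number of photons, which is the source of the classical hardness the proposal relies on.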
Bulmer says that the team knew that one of the main bottlenecks in the classical calculation was determining the “loop Hafnian,” a matrix function at the heart of simulating Gaussian-boson-sampling experiments. This function gives the probability of measuring a particular distribution of photons at the end of the experiment, and it is inherently difficult to calculate classically, which gives Gaussian boson samplers their advantage over classical computers. Bulmer, Bell, and their colleagues found that they could improve the calculation time by taking advantage of patterns in the structure of the matrix that mathematically describes how the photons travel through the maze of beam splitters. This change, along with some other improvements and simplifications, allowed the team to reduce the estimated simulation time of the USTC experiment to just 73 days.
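To make the bottleneck concrete, the following sketch implements the loop Hafnian directly from its textbook definition: a sum over all ways of partitioning the matrix indices into pairs and “loops” (singletons), where pairs pick up off-diagonal entries and loops pick up diagonal entries. This is an exponential-time illustration of why the function is expensive, not the optimized algorithm Bulmer, Bell, and colleagues developed; the function name and the toy matrix are invented for the example.

```python
import numpy as np

def loop_hafnian(A):
    """Loop Hafnian of a symmetric matrix A, computed straight from the
    definition: sum over all partitions of the indices into pairs and
    singletons. Pairs contribute off-diagonal entries A[i, j]; singletons
    ("loops") contribute diagonal entries A[i, i]. Exponential time."""
    A = np.asarray(A)

    def expand(remaining):
        if not remaining:
            return 1.0
        i, rest = remaining[0], remaining[1:]
        # Index i either forms a loop ...
        total = A[i, i] * expand(rest)
        # ... or is paired with one of the later indices.
        for k, j in enumerate(rest):
            total += A[i, j] * expand(rest[:k] + rest[k + 1:])
        return total

    return expand(tuple(range(A.shape[0])))

# 2x2 check: the partitions are {0, 1} paired, or both indices looped,
# so lhaf = A[0, 1] + A[0, 0] * A[1, 1] = 2 + 1 * 3 = 5.
A = np.array([[1.0, 2.0],
              [2.0, 3.0]])
print(loop_hafnian(A))
```

In a full simulation, a function like this would be evaluated for each candidate photon-detection pattern on a matrix derived from the experiment, which is why exploiting structure in that matrix, as the team did, translates directly into a shorter overall simulation time.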
In the distant future, most researchers expect that quantum computers will outperform classical ones by such a large margin that nobody could possibly doubt that they are better. Aaronson has the same hope, but in the meantime, he thinks that classical computers “can, at least for a while, fight back.” He says, “developments like these send a message that the experimenters need to up their game if they want [a] quantum [advantage]…to be maintained and improved into the future.”