IBM’s concept of quantum volume tries to measure quantum computing progress in ways beyond counting qubits.
Measuring the progress of quantum computers can prove tricky in the era of “noisy” quantum computing technology. One concept, known as “quantum volume,” has become a favored measure among companies such as IBM and Honeywell. But not every company or researcher agrees on its usefulness as a yardstick in quantum computing.
“Think of quantum volume as the average of worst-case circuits run on any quantum computer,” says Jay Gambetta, a research fellow and vice president in quantum computing at IBM. “The result means that if this ‘worst case’ is possible, quantum volume is a measure of the circuits’ quality; the higher the quality, the more complex the circuits that can be run on a quantum computer.”
More specifically, IBM’s team defines quantum volume as 2 to the power of the size of the largest circuit with equal width and depth that can pass a certain reliability test involving random two-qubit gates, says Daniel Lidar, director of the Center for Quantum Information Science and Technology at the University of Southern California in Los Angeles.
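For illustration, here is a minimal sketch of how that definition turns into a single number, assuming the published pass criterion that random square “model circuits” must achieve a heavy-output probability above two-thirds; the function name and measurement values below are hypothetical, not IBM’s benchmarking code:

```python
# Illustrative sketch, not IBM's benchmarking code: quantum volume is
# 2**n for the largest circuit size n (width = depth = n) whose random
# model circuits pass a heavy-output reliability test. The published
# protocol also requires statistical confidence in the estimate, which
# is omitted here for brevity.

PASS_THRESHOLD = 2.0 / 3.0  # heavy-output probability a size must exceed

def quantum_volume(heavy_output_probs):
    """heavy_output_probs: hypothetical dict mapping circuit size n to the
    measured heavy-output probability for random n-by-n circuits."""
    passing = [n for n, p in heavy_output_probs.items() if p > PASS_THRESHOLD]
    if not passing:
        return 1  # no square circuit size passes the test
    return 2 ** max(passing)

# Hypothetical measurement results for sizes 2 through 6:
results = {2: 0.84, 3: 0.79, 4: 0.72, 5: 0.69, 6: 0.61}
print(quantum_volume(results))  # prints 32, i.e. 2**5
```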
Since IBM began promoting the term more widely in late 2019, quantum volume has appeared repeatedly in quantum computing papers and press releases from IBM and other companies such as Honeywell. But at least one tech company CEO is already floating the idea that the end of quantum volume’s usefulness might be in sight.
IonQ CEO Peter Chapman described how reductions in noise could effectively yield a high-fidelity, 32-qubit system with a quantum volume of approximately 4 million. Within 18 months, he suggested, quantum volume numbers could grow so large that researchers might need to rethink the definition of quantum volume to retain its usefulness.
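To connect that figure to the definition above (a back-of-the-envelope calculation, not a claim about any particular machine’s test results):

$$
2^{22} = 4{,}194{,}304 \approx 4 \text{ million},
$$

so under the $2^{n}$ definition, a quantum volume of roughly 4 million would correspond to passing square test circuits of width and depth 22, even on a machine with 32 physical qubits.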