(Optics.org) Researchers in a U.S. consortium led by the Department of Energy’s Fermilab, near Chicago, say they are moving closer to solving one of the biggest challenges posed by quantum computing: the “error factor.” They hope their work will help realize the promise of quantum computing that researchers have been pursuing for decades.
A federal grant of $115 million is funding work at Fermilab – a leading player in research on the peculiar behavior of qubits as a computational resource – and at the other institutions in the consortium, called the Superconducting Quantum Materials and Systems Center, or SQMS, to advance quantum computing.
“If we don’t deliver error correction, there will be no computing,” says Bane Vasic, a University of Arizona professor of electrical and computer engineering. “No communications. No nothing, without error correction.” Vasic is director of the University of Arizona’s Error Correction Laboratory and an architect of cutting-edge codes and algorithms used in the communications and data-storage industries.
An error, in the quantum realm, is any unwanted behavior that alters your information. “For example, in conventional computing an alpha particle could hit the silicon,” Vasic said. “It could destroy or flip your bit.”
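The bit-flip scenario Vasic describes is the classic motivation for error correction. As a rough illustration, here is a minimal sketch of the simplest classical remedy, the three-bit repetition code, which recovers from a single flipped bit by majority vote. The function names are illustrative, and this classical code is only an analogy; the quantum codes pursued by groups like SQMS must handle errors without directly copying or reading qubits.

```python
# Illustrative sketch: the classical 3-bit repetition code.
# Each logical bit is stored as three physical copies, so a single
# flipped bit (e.g. from an alpha-particle strike) can be corrected
# by majority vote. A classical analogy only, not a quantum code.

def encode(bit: int) -> list:
    """Store one logical bit as three physical copies."""
    return [bit, bit, bit]

def flip(codeword: list, index: int) -> list:
    """Simulate a single bit-flip error at the given position."""
    corrupted = list(codeword)
    corrupted[index] ^= 1
    return corrupted

def decode(codeword: list) -> int:
    """Recover the logical bit by majority vote."""
    return 1 if sum(codeword) >= 2 else 0

codeword = encode(1)           # [1, 1, 1]
corrupted = flip(codeword, 0)  # [0, 1, 1] -- one bit destroyed
print(decode(corrupted))       # majority vote still recovers 1
```

Any single flip leaves two correct copies, so the vote always recovers the original bit; two simultaneous flips would defeat it, which is why stronger codes exist.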
“This era is like what happened when electrical engineering emerged within physics,” said Vasic. “Now quantum is everywhere. Now that the theory is established, engineering challenges need to be solved to translate it into reality. The concepts of quantum computation are very exciting and beautiful.”