Too much quantum entanglement involving the output qubits prevents quantum machine learning models from being trained
(EurekaAlert) An international group of researchers has identified an important barrier that prevents quantum machine learning models from being trained: too much quantum entanglement.
The hope is that quantum neural networks will one day combine the strengths of quantum computation and traditional neural networks; however, recent theoretical research points to potential difficulties.
Machine learning requires an algorithm to learn from data in a phase known as training. During training, the algorithm progressively improves at the given task. However, a large class of quantum algorithms has been mathematically proven to achieve only negligible improvement because of a phenomenon known as a barren plateau, first reported by a team from Google in 2018. Encountering a barren plateau can stop a quantum algorithm from learning.
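To make the effect concrete, the sketch below is an illustrative NumPy simulation, not the authors' model; the circuit layout, depth, observable and sample counts are assumptions chosen only for demonstration. It estimates the variance of a cost gradient for randomly initialised layered circuits and shows that variance shrinking as qubits are added, which is the signature of a barren plateau.

```python
# Illustrative sketch (assumed toy circuit, not the paper's model): estimate the
# variance of a cost gradient for randomly initialised layered RY + CZ circuits
# and watch it shrink as the number of qubits grows (a barren plateau).
import numpy as np

rng = np.random.default_rng(0)

I2 = np.eye(2, dtype=complex)
Z = np.diag([1.0, -1.0]).astype(complex)

def kron_all(mats):
    out = np.array([[1.0 + 0j]])
    for m in mats:
        out = np.kron(out, m)
    return out

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def cz_phases(n):
    # Diagonal of a layer of CZ gates on neighbouring qubits.
    phases = np.ones(2 ** n)
    for q in range(n - 1):
        for b in range(2 ** n):
            if (b >> (n - 1 - q)) & 1 and (b >> (n - 2 - q)) & 1:
                phases[b] *= -1
    return phases

def circuit_state(thetas, n, layers):
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0
    cz = cz_phases(n)
    k = 0
    for _ in range(layers):
        U = kron_all([ry(thetas[k + q]) for q in range(n)])
        k += n
        state = cz * (U @ state)          # rotation layer, then entangling layer
    return state

def cost(thetas, n, layers, obs):
    psi = circuit_state(thetas, n, layers)
    return np.real(np.vdot(psi, obs @ psi))

layers = 4
for n in [2, 4, 6, 8]:
    obs = kron_all([Z] + [I2] * (n - 1))  # measure Z on the first qubit
    grads = []
    for _ in range(200):
        thetas = rng.uniform(0, 2 * np.pi, size=n * layers)
        shift = np.zeros_like(thetas)
        shift[0] = np.pi / 2
        # Parameter-shift rule for the gradient w.r.t. the first angle.
        g = 0.5 * (cost(thetas + shift, n, layers, obs)
                   - cost(thetas - shift, n, layers, obs))
        grads.append(g)
    print(f"{n} qubits: Var[dC/dtheta_0] = {np.var(grads):.2e}")
```

On a typical run the printed variances fall quickly with qubit count, illustrating why gradient-based training of randomly initialised quantum circuits can stall before it starts.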
The new research investigates the causes of barren plateaus, focusing on the impact of too much entanglement. Entanglement between qubits, or quantum bits, is the quantum effect that makes the exponential speedups of quantum computers possible.
“While entanglement is necessary for quantum speedups, the research indicates the need for careful design of which qubits should be entangled and how much,” says research co-author Dr Maria Kieferova, Research Fellow at the ARC Centre for Quantum Computation and Communication Technology based at the University of Technology Sydney.
“This contradicts the common understanding that more quantum entanglement means greater speedups.”
“We have proven that excess entanglement between the output qubits, or visible units, and the rest of the quantum neural network hinders the learning process and that large amounts of entanglement can be catastrophic for the model,” says lead author Dr Carlos Ortiz Marrero, who is currently a Research Assistant Professor at North Carolina State University.
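The quoted mechanism can be illustrated with a small, hypothetical calculation (plain NumPy, not the paper's quantum neural network): when the joint state of the visible and hidden qubits is highly entangled, as a Haar-random state is, the reduced state of the visible qubits is nearly maximally mixed, so measurements on the output units carry almost no trainable signal. The split into visible and hidden registers below is an assumption made purely for illustration.

```python
# Hypothetical sketch: for Haar-random states of a visible+hidden register, the
# entanglement entropy across the visible/hidden cut approaches its maximum,
# meaning the visible (output) qubits look almost maximally mixed.
import numpy as np

rng = np.random.default_rng(1)

def random_state(n_qubits):
    v = rng.normal(size=2 ** n_qubits) + 1j * rng.normal(size=2 ** n_qubits)
    return v / np.linalg.norm(v)

def visible_entropy(state, n_visible, n_hidden):
    # Schmidt decomposition across the visible/hidden cut via an SVD.
    m = state.reshape(2 ** n_visible, 2 ** n_hidden)
    s = np.linalg.svd(m, compute_uv=False)
    p = s ** 2
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())

n_visible = 2
for n_hidden in [2, 4, 6, 8]:
    ent = np.mean([visible_entropy(random_state(n_visible + n_hidden),
                                   n_visible, n_hidden)
                   for _ in range(50)])
    print(f"{n_hidden} hidden qubits: mean visible/hidden entropy = {ent:.3f} "
          f"(maximum possible = {n_visible} bits)")
```

As the hidden register grows, the average entropy approaches the 2-bit maximum, which is the regime in which, per the result above, the output qubits stop providing a useful training signal.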
“This result teaches us which structures of quantum neural networks we need to avoid for successful algorithms.”