Quantum News Briefs: November 22, 2023: University of Michigan Researchers Awarded Gordon Bell Prize; University of Warsaw Research Looks at Quantum Backflow; Texas A&M Scientists Combine Quantum Computing and Genetics; and MORE!
Quantum News Briefs: November 22nd, 2023:
Quantum Material Simulation Wins 2023 ACM Gordon Bell Prize
The prestigious Gordon Bell Prize was recently awarded to a team led by the University of Michigan for a breakthrough in materials modeling that brings quantum mechanical accuracy to system sizes within the reach of today’s supercomputers. The advance, spearheaded by Professor Vikram Gavini, bridges the gap between Quantum Many-Body (QMB) methods and Density Functional Theory (DFT), overcoming traditional constraints in modeling the quantum wavefunctions of many interacting electrons. The method has significant implications for the computational design of materials across diverse fields, including the development of better alloys, catalysts, and drugs. Among the team’s notable achievements was the calculation of a dislocation in magnesium with yttrium solute atoms, a system of approximately 620,000 electrons, on the Frontier exascale supercomputer; the computation sustained an extraordinary 660 petaflops. The Association for Computing Machinery’s recognition underscores the team’s contribution to predictive modeling in materials physics and heralds a new era of exascale computing. The team, which includes members from the University of Michigan and collaborators from the Indian Institute of Science and Oak Ridge National Laboratory, has been lauded for its multi-year effort and dedication to this groundbreaking work.
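For a sense of why this matters, the gap the team bridged is one of cost scaling: exact many-body wavefunction methods grow exponentially with electron count, while DFT-style methods scale roughly cubically. The toy comparison below (an illustrative back-of-the-envelope sketch in arbitrary units, not the team’s actual algorithm or cost model) shows why roughly 620,000 electrons is thinkable for the latter and hopeless for the former.

```python
# Illustrative comparison of computational cost scaling (toy numbers,
# not the Michigan team's actual implementation or cost model).
import math

def exact_qmb_cost_exponent(n_electrons: int) -> float:
    """Exact many-body wavefunction methods scale exponentially: the
    Hilbert space grows ~2**N, so even ~100 electrons is intractable.
    Returns the base-10 exponent of the cost."""
    return n_electrons * math.log10(2)

def dft_cost_exponent(n_electrons: int) -> float:
    """Conventional DFT scales roughly as O(N^3) in the number of
    electrons, which is why ~10^5-10^6 electron systems become
    reachable on exascale machines. Returns the base-10 exponent."""
    return 3 * math.log10(n_electrons)

for n in (10, 100, 620_000):
    print(f"N = {n:>7,}: exact ~ 10^{exact_qmb_cost_exponent(n):,.0f} "
          f"vs DFT ~ 10^{dft_cost_exponent(n):.1f} (arbitrary units)")
```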
University of Warsaw Research makes progress toward observing quantum backflow in two dimensions
In a study published in Optica, researchers from the University of Warsaw’s Faculty of Physics have demonstrated a counterintuitive optical phenomenon known as “azimuthal backflow.” Led by Dr. Radek Lapkiewicz, the team superposed two light beams twisted in a clockwise direction and observed anticlockwise twists in the dark regions of the resulting field. This discovery, an optical analog of the quantum backflow phenomenon, holds significant implications for studying light-matter interactions and advances our understanding of quantum mechanics. The research, which used a Shack-Hartmann wavefront sensor for detailed spatial measurements, represents a crucial step toward observing quantum backflow in two dimensions, with potential applications ranging from optical trapping to the development of ultra-precise atomic clocks.
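The effect can be reproduced numerically from the description above. The sketch below (a minimal illustration with arbitrarily chosen beam parameters, not the Warsaw group’s experimental configuration) superposes two azimuthal phases exp(i·m·φ) with negative, i.e. clockwise, topological charges and unequal amplitudes, then computes the local phase gradient: it turns positive (anticlockwise) precisely where the intensity is lowest.

```python
# Minimal numerical illustration of azimuthal backflow (toy parameters,
# not the Warsaw group's experimental settings).
import numpy as np

m1, m2 = -1, -2        # both beams twist clockwise (negative charge)
a1, a2 = 1.0, 0.9      # unequal amplitudes are essential for backflow

phi = np.linspace(0, 2 * np.pi, 100_000, endpoint=False)
psi = a1 * np.exp(1j * m1 * phi) + a2 * np.exp(1j * m2 * phi)

# Local azimuthal phase gradient: k_loc = d(arg psi)/dphi
#                                       = Im(psi' * conj(psi)) / |psi|^2
dpsi = np.gradient(psi, phi)
k_loc = np.imag(dpsi * np.conj(psi)) / np.abs(psi) ** 2

print(f"component twists: m1 = {m1}, m2 = {m2} (both clockwise)")
print(f"max local twist:  {k_loc.max():+.2f} (positive = anticlockwise)")
print(f"intensity there:  {np.abs(psi[k_loc.argmax()])**2:.4f} (a dark region)")
```

With these parameters the local twist peaks near +8 even though both component beams carry negative charge, and the peak sits in the low-intensity region, matching the reported behavior.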
Researchers use quantum computing to predict gene relationships
Researchers at Texas A&M University have made a significant advance in genetic research by employing quantum computing, as detailed in their study published in npj Quantum Information. The team used quantum computing to map gene regulatory networks (GRNs), revealing complex gene interactions that had gone undetected with traditional computing methods. This approach allowed them to predict gene relationships more accurately, with profound implications for human and animal medicine. Because a qubit can occupy a superposition of states, quantum computing can represent a gene as simultaneously active and inactive, providing a more comprehensive picture of how genes influence one another. This advance in computational biology, bridging physics and biology, opens new avenues for exploring cellular processes and disease mechanisms, potentially transforming approaches to medical research and treatment.
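The paper’s actual encoding is more involved, but the core idea of letting qubit superposition stand in for a gene’s mixed active/inactive state can be sketched in a few lines. Below is a hypothetical two-gene toy model (the genes, circuit, and the “A up-regulates B” relationship are invented for illustration and are not the Texas A&M team’s construction):

```python
# Toy gene-regulatory sketch: two genes as qubits (hypothetical model,
# not the circuit used in the npj Quantum Information study).
import numpy as np

rng = np.random.default_rng(0)

# Basis order |gene_B, gene_A>: index = 2*B + A; |1> means "active".
state = np.zeros(4, dtype=complex)
state[0] = 1.0                       # both genes start inactive: |00>

# Hadamard on gene A: A is simultaneously active and inactive.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
state = np.kron(np.eye(2), H) @ state

# CNOT (A controls B): gene B activates whenever gene A is active,
# encoding the invented relationship "A up-regulates B".
CNOT = np.array([[1, 0, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0]], dtype=float)
state = CNOT @ state

# Sample measurements and estimate P(B active | A active).
probs = np.abs(state) ** 2
shots = rng.choice(4, size=10_000, p=probs)
a_active = shots % 2 == 1
b_active = shots // 2 == 1
print(f"P(B active | A active) = {np.mean(b_active[a_active]):.3f}")  # ~1.0
```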
In Other News, Forbes article: “Quantum Artificial Intelligence Is Closer Than You Think”
A recent Forbes article argues that the convergence of artificial intelligence (AI) and quantum computing is poised to transform the technology industry and revolutionize business innovation. The rapid evolution of generative AI, especially since late 2022, has demonstrated remarkable capabilities in producing human-like text and graphics. Despite these advances, AI is currently limited by the processing power achievable with silicon-based hardware. Quantum computing, a method of processing information grounded in quantum mechanics, promises to overcome some of these limitations. Quantum computers use qubits, which can represent 0 and 1 simultaneously; the article contends this could eventually yield processing millions of times faster than today’s microchips for certain problems. Major tech companies such as IBM, Microsoft, and Google are already exploring practical applications of quantum computing in fields such as pharmaceuticals, cybersecurity, and weather forecasting. The integration of AI with quantum computing, dubbed Quantum Artificial Intelligence (QAI), is expected to greatly enhance AI’s speed, efficiency, and accuracy, and the article urges organizations across nearly every industry to prepare for this imminent technological shift.
In Other News, IBM Quantum Blog: “Updating how we measure quantum quality and speed”
In a new blog post, IBM Quantum introduces two metrics, Error Per Layered Gate (EPLG) and CLOPS_h, designed to measure the performance of quantum processors with more than 100 qubits. The metrics aim to give a more complete picture of processor capability, addressing the limitations of the traditional Quantum Volume benchmark. EPLG captures the average error of each gate in layered circuits, reflecting individual qubit, gate, and crosstalk performance. CLOPS_h, an updated version of the Circuit Layer Operations Per Second (CLOPS) metric, reflects real-world hardware constraints by focusing on the parallel execution of two-qubit gates within the system architecture. These changes mark a significant step forward in benchmarking quantum computers, particularly in the era of utility-scale quantum computing.
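IBM’s post gives the formal definitions; as a rough illustration of how a derived per-gate error works, the sketch below assumes EPLG is obtained from a measured layer fidelity (LF) over a chain containing n two-qubit gates via EPLG = 1 − LF^(1/n). The fidelity value is invented for demonstration, and the exact normalization should be taken from IBM’s blog rather than this sketch.

```python
# Hedged sketch: deriving error per layered gate (EPLG) from layer fidelity.
# Assumes EPLG = 1 - LF**(1/n) over an n-gate chain; the fidelity value
# below is invented for illustration, not a measured IBM result.

def eplg(layer_fidelity: float, n_two_qubit_gates: int) -> float:
    """Average error attributed to each two-qubit gate in the layered chain."""
    return 1.0 - layer_fidelity ** (1.0 / n_two_qubit_gates)

# A 100-qubit linear chain contains 99 two-qubit connections.
lf = 0.30          # hypothetical layer fidelity measured over the whole chain
n_gates = 99
print(f"EPLG = {eplg(lf, n_gates):.4%} per two-qubit gate")
# -> roughly 1.2% error per gate for this hypothetical chain
```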