Particle Physics Looks to Quantum Computing to Solve Big Data Problems

Display of a simulated High-Luminosity Large Hadron Collider (HL-LHC) particle collision event in an upgraded ATLAS detector. The event has an average of 200 collisions per particle bunch crossing. (Credit: ATLAS/CERN Collaboration)

Large-scale physics experiments increasingly rely on big data and complex algorithms running on powerful computers, and managing this growing mass of data presents its own unique challenges.

To better prepare for the data deluge posed by next-generation upgrades and new experiments, physicists are turning to the nascent field of quantum computing in search of faster ways to analyze incoming information.


In a conventional computer, memory takes the form of a large collection of bits, and each bit holds only one of two values: one or zero, like an on or off switch. In a quantum computer, data is stored in quantum bits, or qubits. A qubit can represent a one, a zero, or a superposition in which it is both a one and a zero at the same time.
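
To make that concrete, here is a minimal sketch in plain Python/NumPy – deliberately not tied to any quantum hardware or vendor library – of a single qubit placed into an equal superposition. The state vector and gate are standard textbook definitions, used here purely for illustration.

```python
import numpy as np

# A qubit's state is a two-component complex vector of amplitudes
# for the basis states |0> and |1>.
zero = np.array([1, 0], dtype=complex)  # definitely "0"

# The Hadamard gate rotates |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
superposed = H @ zero

# On measurement, each outcome occurs with probability |amplitude|^2.
print(np.abs(superposed) ** 2)  # -> [0.5 0.5]: equally likely to read 0 or 1
```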

By exploiting this and other quantum properties, quantum computers have the potential to handle larger data sets and quickly solve some problems that would trip up even the world’s fastest supercomputers. For other kinds of problems, however, conventional computers will continue to outperform quantum machines.

The High-Luminosity Large Hadron Collider (HL-LHC) project, a planned upgrade of the world’s largest particle accelerator at the CERN laboratory in Europe, will go live in 2026. It will produce billions of particle events per second – five to seven times more data than the current maximum rate – and CERN is looking for new approaches to analyze this data quickly and accurately.

In these particle events, positively charged subatomic particles called protons collide, producing sprays of other particles, including quarks and gluons, from the energy of the collision. Particle interactions can also cause other particles to appear, such as the Higgs boson.

Tracking the creation and precise trajectories (called “tracks”) of these particles as they pass through the layers of a particle detector – while ruling out the unwanted clutter, or “noise,” produced during these events – is essential to analyzing collision data.

The data will resemble a giant 3D puzzle made of many separate pieces, with little guidance on how to connect the dots.

To solve this next-generation problem, a group of student researchers and other scientists from the US Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) explored a wide range of new solutions.

One such approach is to develop and test a variety of algorithms suitable for different types of quantum computing systems. Their goal: to explore whether these technologies and techniques hold promise for reconstructing these particle trajectories better and faster than conventional computers.

Particle detectors work by sensing the energy that particles deposit in the detector’s layered materials. In analyzing the detector data, researchers work to reconstruct the trajectories of specific particles traveling through the detector array. Computer algorithms can aid this process through pattern recognition: particle properties can be detailed by connecting the individual “hit” points collected by the detector and correctly assigning them to individual particle trajectories.
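
As a rough illustration of the combinatorial flavor of that task, the toy sketch below scores every combination of hits across three detector layers by how close to a straight line it is, and keeps the straightest combination as a track candidate. The hit coordinates are hypothetical, and this is not the HEP.QPR code or the real detector geometry; at HL-LHC scale the number of such combinations explodes, which is what motivates the search for faster approaches.

```python
import numpy as np
from itertools import product

# Hypothetical (x, y) hit positions in three detector layers, including noise.
layers = [
    np.array([[1.0, 1.1], [3.0, 0.2]]),  # layer 0
    np.array([[2.0, 2.1], [1.5, 3.0]]),  # layer 1
    np.array([[3.0, 3.2], [0.5, 1.0]]),  # layer 2
]

def straightness(a, b, c):
    """Perpendicular distance of hit b from the line through hits a and c
    (smaller means more track-like)."""
    ab, ac = b - a, c - a
    cross = ab[0] * ac[1] - ab[1] * ac[0]
    return abs(cross) / (np.linalg.norm(ac) + 1e-12)

# Pattern recognition as combinatorics: score every one-hit-per-layer triplet.
score, triplet = min(
    (straightness(a, b, c), (tuple(a), tuple(b), tuple(c)))
    for a, b, c in product(*layers)
)
print(score, triplet)  # the most collinear triplet is the track candidate
```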


A new wheel-shaped muon detector is part of the ATLAS detector upgrade at CERN. This wheel-shaped detector is over 30 feet in diameter. (Credit: Julien Marius Ordan/CERN)

Heather Gray, an experimental particle physicist at Berkeley Lab and a professor of physics at UC Berkeley, leads the Berkeley Lab-based R&D effort – Quantum Pattern Recognition for High-Energy Physics (HEP.QPR) – which seeks to identify quantum technologies that can rapidly perform this pattern recognition process on very high-volume collision data. This R&D effort is funded under the DOE’s QuantISED (Quantum Information Science Enabled Discovery for High Energy Physics) portfolio.

The HEP.QPR project is also part of a larger initiative to stimulate research in quantum information science at Berkeley Lab and US National Laboratories.

Other members of the HEP.QPR group include Wahid Bhimji, Paolo Calafiura, and Wim Lavrijsen. Berkeley Lab postdoctoral researcher Illya Shapoval, who helped establish the HEP.QPR project and explored quantum algorithms for associative memory as a member of the group, has since joined a foundational research project on quantum computing algorithms. Bhimji is a big data architect at Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC). Calafiura is chief software architect for CERN’s ATLAS experiment and a member of the Computational Research Division (CRD) at Berkeley Lab. And Lavrijsen is a CRD software engineer who is also involved in CERN’s ATLAS experiment.

HEP.QPR project members collaborated with researchers from the University of Tokyo and from Canada on the development of quantum algorithms for high-energy physics, and jointly organized a mini-workshop on quantum computing at Berkeley Lab in October 2019.

Gray and Calafiura also participated in a CERN-sponsored competition, launched in mid-2018, that challenged computer scientists to develop machine learning-based techniques to accurately reconstruct particle trajectories using a simulated HL-LHC dataset known as TrackML. Machine learning is a form of artificial intelligence in which algorithms become more efficient and accurate through a gradual training process, loosely analogous to human learning. Berkeley Lab’s quantum computing effort in particle track reconstruction also uses this simulated TrackML dataset.

Berkeley Lab and UC Berkeley play an important role in the rapidly evolving field of quantum computing through their participation in several quantum-focused efforts, including The Quantum Information Edge, a research alliance announced in December 2019.

Quantum Information Edge is a nationwide alliance of national laboratories, universities, and industry partners advancing the frontiers of quantum computing systems to meet scientific challenges and maintain American leadership in next-generation information technologies. It is led by DOE’s Berkeley Lab and Sandia National Laboratories.

The series of articles listed below profiles three student researchers who participated in Berkeley Lab’s efforts to apply quantum computing to the pattern recognition problem in particle physics:

Lucy Linder, while working as a researcher at Berkeley Lab, developed her master’s thesis – supervised by Berkeley Lab scientist Paolo Calafiura – on the potential application of a quantum computing technique called quantum annealing to finding particle tracks. She remotely accessed quantum computing machines at D-Wave Systems Inc. in Canada and Los Alamos National Laboratory in New Mexico.

Linder’s approach was to first format the simulated particle tracking data as a problem known as a QUBO (quadratic unconstrained binary optimization) problem, which expresses the problem as an equation over binary values: one or zero. This QUBO formatting also prepared the data for analysis by quantum annealing, which uses qubits to identify the best possible solution by exploiting a physical principle: systems naturally seek out their lowest possible energy state. Read more.
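
For a sense of what a QUBO looks like, the sketch below builds a tiny, made-up QUBO in plain Python and minimizes it by exhaustive search. A quantum annealer would explore the same energy landscape physically; the real track-finding QUBO is far larger, and its coefficients encode hit-pairing constraints rather than these illustrative values.

```python
import numpy as np
from itertools import product

# Toy QUBO: minimize x^T Q x over binary vectors x. Diagonal terms bias
# individual bits; off-diagonal terms reward or penalize pairs of bits
# being "on" together. (Illustrative coefficients, not the real problem.)
Q = np.array([
    [-1.0,  2.0,  0.0],
    [ 0.0, -1.0, -2.0],
    [ 0.0,  0.0, -1.0],
])

def qubo_energy(x, Q):
    return x @ Q @ x

# An annealer physically relaxes toward the lowest-energy assignment;
# for three bits we can simply enumerate all 2^3 possibilities.
best = min((qubo_energy(np.array(x), Q), x) for x in product([0, 1], repeat=3))
print(best)  # -> (-4.0, (0, 1, 1)): the lowest-energy bit assignment
```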

Eric Rohm, an undergraduate student working at Berkeley Lab through the DOE’s Science Undergraduate Laboratory Internship program, developed a quantum approximate optimization algorithm (QAOA) using quantum computing resources at Rigetti Computing in Berkeley, California. He was supervised by Berkeley Lab physicist Heather Gray.

This approach used a mix of conventional and quantum computing techniques to develop a custom algorithm. The algorithm, still being refined, was tested on the Rigetti Quantum Virtual Machine, a conventional computer that simulates a small quantum computer. It may eventually be tested on a Rigetti quantum processing unit equipped with real qubits. Read more.
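
The general shape of such a hybrid algorithm can be sketched without any vendor SDK. Below, a one-layer QAOA for a toy MaxCut problem on two qubits is simulated with plain NumPy (standing in for a quantum virtual machine), with a classical grid search playing the role of the outer optimizer. The graph, angles, and grid are illustrative only, not the project’s actual setup.

```python
import numpy as np

# One-layer QAOA for MaxCut on a 2-node, 1-edge toy graph. Basis order:
# |00>, |01>, |10>, |11>; the edge is "cut" when the two bits disagree,
# so the cost (number of edges cut) per basis state is:
cut = np.array([0, 1, 1, 0], dtype=float)

X = np.array([[0, 1], [1, 0]], dtype=complex)
I = np.eye(2, dtype=complex)

def qaoa_expectation(gamma, beta):
    state = np.full(4, 0.5, dtype=complex)           # |+>|+> uniform start
    state = np.exp(-1j * gamma * cut) * state        # phase-separate by cost
    Rx = np.cos(beta) * I - 1j * np.sin(beta) * X    # mixer e^{-i beta X}
    state = np.kron(Rx, Rx) @ state                  # mix both qubits
    return float(np.abs(state) ** 2 @ cut)           # expected cut size

# Classical outer loop: grid-search the two angles. A real hybrid run
# would use an optimizer, with the quantum device evaluating the inner step.
grid = np.linspace(0, np.pi, 50)
best = max((qaoa_expectation(g, b), g, b) for g in grid for b in grid)
print(best)  # ~ (1.0, gamma*, beta*): the single edge is cut almost surely
```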

Amitabh Yadav, a student research associate at Berkeley Lab since November, supervised by Gray and Berkeley Lab software engineer Wim Lavrijsen, is working on applying a quantum version of a conventional technique called the Hough transform to identify and reconstruct particle tracks using the IBM Quantum Experience, a cloud-based quantum computing platform.

The classic Hough transform can be used to detect specific features such as lines, curves, and circles in complex patterns, and the quantum Hough transform could potentially pick out more complex shapes from exponentially larger datasets. Read more.
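
For reference, here is a minimal classical Hough transform in NumPy (the quantum variant is beyond a short sketch): each point votes in (theta, rho) parameter space, and collinear points pile their votes into a single cell, whose peak reveals the line. The points and bin counts are made up for illustration.

```python
import numpy as np

# Toy points, four of them on the line y = x + 1 plus one noise point.
points = np.array([[0, 1], [1, 2], [2, 3], [3, 4], [2.5, 0.0]])

# Parameter space: every line is rho = x*cos(theta) + y*sin(theta).
thetas = np.linspace(0, np.pi, 180, endpoint=False)
rhos = np.linspace(-6, 6, 120)
accumulator = np.zeros((len(thetas), len(rhos)), dtype=int)

# Each point casts one vote per theta for the rho of the line through it.
for x, y in points:
    for i, t in enumerate(thetas):
        rho = x * np.cos(t) + y * np.sin(t)
        j = np.argmin(np.abs(rhos - rho))  # nearest rho bin
        accumulator[i, j] += 1

# The peak cell identifies the dominant line's parameters.
i, j = np.unravel_index(accumulator.argmax(), accumulator.shape)
print(f"theta = {thetas[i]:.2f} rad, rho = {rhos[j]:.2f}")  # ~2.36, ~0.71
```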

# # #

Founded in 1931 on the belief that the greatest scientific challenges are best met by teams, Lawrence Berkeley National Laboratory and its scientists have been awarded 13 Nobel Prizes. Today, Berkeley Lab researchers are developing sustainable energy and environmental solutions, creating useful new materials, pushing the boundaries of computing, and probing the mysteries of life, matter, and the universe. Scientists around the world rely on the laboratory’s facilities for their own scientific discovery. Berkeley Lab is a multi-program national laboratory, operated by the University of California for the US Department of Energy’s Office of Science.

The DOE’s Office of Science is the largest supporter of basic physical science research in the United States and works to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.
