Display of a simulated High-Luminosity Large Hadron Collider (HL-LHC) particle collision event in an upgraded ATLAS detector. The event has an average of 200 collisions per particle bunch crossing. (Credit: ATLAS Collaboration/CERN)
Large-scale physics experiments rely increasingly on big data and complex algorithms running on powerful computers, and managing this growing mass of data presents its own unique challenges.
To better prepare for the data deluge posed by next-generation upgrades and new experiments, physicists are turning to the fledgling field of quantum computing to find faster ways to analyze the incoming information.
In a conventional computer, memory takes the form of a large collection of bits, and each bit has only two values: a one or zero, akin to an on or off position. In a quantum computer, meanwhile, data is stored in quantum bits, or qubits. A qubit can represent a one, a zero, or a mixed state in which it is both a one and a zero at the same time.
By tapping into this and other quantum properties, quantum computers hold the potential to handle larger datasets and quickly work through some problems that would trip up even the world’s fastest supercomputers. For other types of problems, though, conventional computers will continue to outperform quantum machines.
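As a rough illustration (not part of the Berkeley Lab work described here), the sketch below uses plain NumPy to mimic a single qubit: a Hadamard gate puts a qubit that starts out as a definite zero into an equal mix of zero and one, and simulated measurements then return each value about half the time.

```python
# Minimal sketch of qubit superposition, assuming nothing beyond NumPy.
import numpy as np

ket0 = np.array([1.0, 0.0])                            # qubit prepared in the "0" state
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)   # Hadamard gate

state = H @ ket0                                        # equal superposition of 0 and 1
probs = np.abs(state) ** 2                              # measurement probabilities (Born rule)

rng = np.random.default_rng(seed=0)
samples = rng.choice([0, 1], size=1000, p=probs)        # simulate 1,000 measurements
print("P(0), P(1):", probs)                             # [0.5, 0.5]
print("fraction of 1s measured:", samples.mean())       # roughly 0.5
```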
The High-Luminosity Large Hadron Collider (HL-LHC) Project, a planned upgrade of the world's largest particle accelerator at the CERN laboratory in Europe, is expected to come online in 2026. It will produce billions of particle events per second, five to seven times more data than the machine's current maximum rate, and CERN is seeking new approaches to rapidly and accurately analyze this data.
In these particle events, positively charged subatomic particles called protons collide, producing sprays of other particles, including quarks and gluons, from the energy of the collision. The interactions of particles can also cause other particles – like the Higgs boson – to pop into existence.
Tracking the creation and precise paths (called “tracks”) of these particles as they travel through layers of a particle detector – while excluding the unwanted mess, or “noise” produced in these events – is key in analyzing the collision data.
The data will be like a giant 3D connect-the-dots puzzle that contains many separate fragments, with little guidance on how to connect the dots.
To address this next-gen problem, a group of student researchers and other scientists at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) have been exploring a wide range of new solutions.
One such approach is to develop and test a variety of algorithms tailored to different types of quantum-computing systems. Their aim: Explore whether these technologies and techniques hold promise for reconstructing these particle tracks better and faster than conventional computers can.
Particle detectors register the energy that particles deposit in successive layers of detector material. In analyzing detector data, researchers work to reconstruct the trajectories of individual particles as they travel through the detector array. Computer algorithms aid this process through pattern recognition: by connecting the dots of individual “hits” recorded by the detector and correctly identifying each particle's trajectory, researchers can pin down the particles' properties.
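As a toy illustration of this connect-the-dots task (a simplified sketch, not the HEP.QPR code), the snippet below scatters hits from a few straight-line tracks across four detector layers and then follows each candidate track outward by greedily picking the nearest hit on the next layer.

```python
# Toy track-following sketch: hits on concentric detector layers, connected greedily.
import numpy as np

rng = np.random.default_rng(1)
n_layers, n_tracks = 4, 3
radii = np.arange(1, n_layers + 1)                       # layer radii
true_angles = rng.uniform(0, 2 * np.pi, size=n_tracks)   # straight "tracks" from the origin

# hits[layer] is a list of (x, y) positions, with a little measurement noise added
hits = [
    [(r * np.cos(a) + rng.normal(0, 0.02), r * np.sin(a) + rng.normal(0, 0.02))
     for a in true_angles]
    for r in radii
]

def nearest(point, candidates):
    """Index of the candidate hit closest to `point`."""
    d = [np.hypot(point[0] - c[0], point[1] - c[1]) for c in candidates]
    return int(np.argmin(d))

# Seed one track per innermost-layer hit, then follow it outward layer by layer.
tracks = []
for seed in hits[0]:
    track = [seed]
    for layer in range(1, n_layers):
        track.append(hits[layer][nearest(track[-1], hits[layer])])
    tracks.append(track)

for i, t in enumerate(tracks):
    print(f"track {i}:", [(round(x, 2), round(y, 2)) for x, y in t])
```

Real detector data is far messier: many more tracks, curved trajectories in a magnetic field, and abundant noise hits, which is exactly why smarter pattern-recognition algorithms are needed.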
This effort, known as the HEP.QPR (quantum pattern recognition) project, is also part of a broader initiative to boost quantum information science research at Berkeley Lab and across U.S. national laboratories.
Members of the HEP.QPR project collaborated with researchers at the University of Tokyo and from Canada on the development of quantum algorithms in high-energy physics, and jointly organized a Quantum Computing Mini-Workshop at Berkeley Lab in October 2019.
Berkeley Lab physicist Heather Gray and staff scientist Paolo Calafiura were also involved in a CERN-sponsored competition, launched in mid-2018, that challenged computer scientists to develop machine-learning-based techniques to accurately reconstruct particle tracks in a simulated set of HL-LHC data known as TrackML. Machine learning is a form of artificial intelligence in which algorithms become more efficient and accurate through a gradual training process akin to human learning. Berkeley Lab's quantum-computing effort in particle-track reconstruction also uses this TrackML set of simulated data.
Berkeley Lab and UC Berkeley are playing important roles in the rapidly evolving field of quantum computing through their participation in several quantum-focused efforts, including the Quantum Information Edge, a research alliance announced in December 2019.
The Quantum Information Edge is a nationwide alliance of national labs, universities, and industry advancing the frontiers of quantum computing systems to address scientific challenges and maintain U.S. leadership in next-generation information technology. It is led by the DOE’s Berkeley Lab and Sandia National Laboratories.
The series of articles listed below profiles three student researchers who have participated in Berkeley Lab-led efforts to apply quantum computing to the pattern-recognition problem in particle physics:
Lucy Linder, while working as a researcher at Berkeley Lab, developed her master's thesis, supervised by Calafiura, on the potential application of a quantum-computing technique called quantum annealing for finding particle tracks. She remotely accessed quantum-computing machines at D-Wave Systems in Canada and at Los Alamos National Laboratory in New Mexico.
Linder's approach was first to recast the simulated particle-track data as a QUBO (quadratic unconstrained binary optimization) problem, which expresses the task as an equation over binary variables that can each take a value of one or zero. This QUBO formulation also helped prepare the data for analysis by a quantum annealer, which uses qubits to identify the best possible solution by applying a physics principle that describes how systems naturally seek their lowest-possible energy state. Read more.
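To make the idea concrete, here is a minimal, hypothetical QUBO with three binary variables, solved by brute force rather than on an annealer; the numbers are illustrative only, with diagonal terms rewarding track segments that should be kept and an off-diagonal term penalizing two segments that conflict.

```python
# Minimal sketch of a QUBO: find binary x minimizing x^T Q x (illustrative values only).
import itertools
import numpy as np

# Hypothetical 3-variable QUBO: diagonal terms reward keeping a segment,
# the off-diagonal term penalizes keeping two conflicting segments together.
Q = np.array([
    [-1.0,  2.0,  0.0],
    [ 0.0, -1.0,  0.0],
    [ 0.0,  0.0, -1.5],
])

best_x, best_e = None, float("inf")
for bits in itertools.product([0, 1], repeat=3):   # enumerate all 2^3 assignments
    x = np.array(bits)
    energy = x @ Q @ x                             # QUBO objective x^T Q x
    if energy < best_e:
        best_x, best_e = x, energy

print("best assignment:", best_x, "energy:", best_e)  # expect energy -2.5, e.g. x = [0 1 1]
```

A quantum annealer tackles the same objective by encoding it in the couplings between qubits and letting the hardware relax toward its lowest-energy configuration, rather than enumerating every assignment.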
Eric Rohm, an undergraduate student working on a contract at Berkeley Lab as part of the DOE's Science Undergraduate Laboratory Internship program, developed a quantum approximate optimization algorithm (QAOA) using quantum-computing resources at Rigetti Computing in Berkeley, California. He was supervised by Gray.
This approach used a blend of conventional and quantum-computing techniques to develop a custom algorithm. The algorithm, which is still being refined, has been tested on the Rigetti Quantum Virtual Machine, a conventional computer that simulates a small quantum computer. It may eventually be tested on a Rigetti quantum processing unit equipped with actual qubits. Read more.
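The sketch below gives a rough sense of that hybrid structure (an assumption-laden toy, not Rohm's algorithm): a classical loop sweeps the two QAOA circuit angles while a simulated two-qubit circuit evaluates a small, made-up cost function.

```python
# Rough sketch of a one-layer QAOA on 2 simulated qubits with a hypothetical cost.
import numpy as np
from scipy.linalg import expm

# Hypothetical diagonal cost: the states |01> and |10> are "good" (cost -1), others 0.
cost = np.diag([0.0, -1.0, -1.0, 0.0])

X = np.array([[0, 1], [1, 0]], dtype=complex)
I2 = np.eye(2, dtype=complex)
mixer = np.kron(X, I2) + np.kron(I2, X)            # standard transverse-field mixer

plus = np.full(4, 0.5, dtype=complex)              # uniform superposition over both qubits

def qaoa_expectation(gamma, beta):
    """One QAOA layer: phase separation by the cost, then mixing, then <cost>."""
    state = expm(-1j * gamma * cost) @ plus
    state = expm(-1j * beta * mixer) @ state
    return float(np.real(state.conj() @ cost @ state))

# Crude classical outer loop: grid search over the two circuit angles.
grid = np.linspace(0, np.pi, 40)
best = min(((qaoa_expectation(g, b), g, b) for g in grid for b in grid))
print("lowest <cost> found: %.3f at gamma=%.2f, beta=%.2f" % best)
```

In a real hybrid workflow, the circuit would run on quantum hardware (or a quantum virtual machine) and a classical optimizer, rather than a grid search, would update the angles between runs.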
Amitabh Yadav, a student research associate at Berkeley Lab since November who is supervised by Gray and Berkeley Lab software engineer Wim Lavrijsen, is working to apply a quantum version of a conventional technique called the Hough transform to identify and reconstruct particle tracks using IBM Quantum Experience, a cloud-based quantum-computing platform.
The classical Hough transform can be used to detect specific features such as lines, curves, and circles in complex patterns, and the quantum Hough transform could potentially pick out more complex shapes from exponentially larger datasets. Read more.
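For reference, the classical version can be sketched in a few lines (illustrative code, not Yadav's implementation): each point votes for every candidate line that could pass through it, and peaks in the resulting accumulator reveal lines shared by many points.

```python
# Classical Hough transform sketch: detect a straight line through a set of points.
import numpy as np

points = [(i, i) for i in range(10)]               # ten collinear points on the line y = x

thetas = np.deg2rad(np.arange(0, 180))             # candidate line orientations (1-degree steps)
rhos = np.arange(-20, 21)                          # candidate signed distances from the origin
accumulator = np.zeros((len(rhos), len(thetas)), dtype=int)

for x, y in points:
    for t_idx, theta in enumerate(thetas):
        rho = x * np.cos(theta) + y * np.sin(theta)    # normal-form line parameterization
        r_idx = int(round(rho)) + 20                   # map rho to an accumulator row
        accumulator[r_idx, t_idx] += 1                 # this point votes for (rho, theta)

r_best, t_best = np.unravel_index(accumulator.argmax(), accumulator.shape)
print("strongest line: rho = %d, theta = %d deg (votes = %d)"
      % (rhos[r_best], t_best, accumulator.max()))
```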
Founded in 1931 on the belief that the biggest scientific challenges are best addressed by teams, Lawrence Berkeley National Laboratory and its scientists have been recognized with 13 Nobel Prizes. Today, Berkeley Lab researchers develop sustainable energy and environmental solutions, create useful new materials, advance the frontiers of computing, and probe the mysteries of life, matter, and the universe. Scientists from around the world rely on the Lab’s facilities for their own discovery science. Berkeley Lab is a multiprogram national laboratory, managed by the University of California for the U.S. Department of Energy’s Office of Science.
DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.
Comments
Guillermo Valdes Mesa says
It is because we do not see that we are working at the limits of human knowledge: anyone who works on quantum computing should know that the transition from classical to quantum computing must solve the problem posed by the transition from quantum to classical physics.
William Tucker says
You are correct and mistaken. There is no difference between quantum and classical….they are all the result of quantum and should be viewed that way…..
William Tucker says
The reality is that you need to understand that Quantum effects have no time lag….that _something_ is being built by the interactions and what needs to be detected is the most obvious DIFFERENT change….what is the smallest thing that is changing immediately that isn’t changing everywhere else….see also topological searches…. satellite reconnaissance of a surface.
You’re looking for peaks, hot spots, updrafts….levels of change…forget the minutia
Guillermo Valdes Mesa says
From my point of view it goes beyond that. For example, you apply quantum phenomena when a medical device delivers heat, light, or sound theranostically to cancer cells; you need to know the type of phonon and the interactions of phonons with other couplings such as spin-orbit, excitons, etc.
William Tucker says
Quantum effects on a cancer cell?
That actually makes no sense….
You’re not actually dealing with the totality of what is involved in this specific cancer cell(s) _event_ when you are dealing with the physical non-holistic existence of the cancer cell….you’re dealing with classical physics while hand waving….
better you should hire a medicine man…..at least they know how to hand wave