James Proudfoot regularly shares his technical expertise with major global scientific endeavors. To date, his efforts have supported many projects at Fermilab, at Argonne National Laboratory (his current home base), and at CERN’s Large Hadron Collider (LHC). At CERN in 2012, Proudfoot assisted the team in its successful pursuit of the Higgs boson. Today, he works with CERN colleagues from around the globe in the ongoing effort to understand subatomic particle physics.
“Today, we are evaluating interactions among particles that are not part of our current understanding,” said Proudfoot. “We refer to that work as physics beyond the Standard Model.”
Proudfoot’s current focus centers on one of CERN’s major detector systems, the Tile Calorimeter. In addition to helping maintain the detector’s current operations, he works on the design of particle-tracking systems intended to handle the increased beam intensities of the high-luminosity upgrade. Proudfoot has also taken on the design of the cooling systems needed to support these powerful tracking devices.
“I picked the rather ‘unglamorous’ project of designing parts of the cooling system that operates behind the scenes,” said Proudfoot. “However, it’s also a gratifying endeavor since that work is an integral part of the detector’s ability to do its job. The cooling system plumbing involves welding small-diameter titanium tubes together. It’s high-precision work without room for error.”
CERN’s ATLAS Detector
In addition to the Higgs boson discovery effort, Proudfoot and scientists from around the globe at CERN search for other exotic particles, such as partners to the Z boson and the top quark, along with other as-yet-unidentified particles. Doing so involves the ATLAS detector, the larger and more powerful of the two general-purpose detectors at the LHC. In his role maintaining the Tile Calorimeter detector system, Proudfoot is a man on the inside—literally.
“Putting it mildly, the ATLAS detector is quite a big device. It sits in a cavern 60 meters long, 40 meters high, and 40 meters wide,” said Proudfoot. “ATLAS measures the properties of particles produced by proton collisions at the heart of the detector. The detector has about 100 million channels and reads data at 40 megahertz. After selecting events of interest, that becomes about 100,000 events from the detector every second. Without an equally substantial compute infrastructure behind it, data collection at that scale would be impossible.”
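The scale Proudfoot describes can be made concrete with a quick back-of-envelope calculation. The sketch below uses only the round figures from his quote (40 MHz readout, roughly 100,000 selected events per second); the bytes-per-event value is a hypothetical placeholder for illustration, not an official ATLAS number.

```python
# Back-of-envelope estimate of ATLAS trigger selection and data throughput,
# based on the round numbers quoted in the article.
COLLISION_RATE_HZ = 40_000_000      # detector read out at 40 megahertz
SELECTED_EVENTS_PER_S = 100_000     # events kept after trigger selection

# Fraction of collisions the trigger keeps for further processing
selection_fraction = SELECTED_EVENTS_PER_S / COLLISION_RATE_HZ
print(f"Trigger keeps {selection_fraction:.2%} of collisions")
# -> Trigger keeps 0.25% of collisions

# Hypothetical event size, purely for illustration
ASSUMED_MB_PER_EVENT = 1.5
throughput_mb_per_s = SELECTED_EVENTS_PER_S * ASSUMED_MB_PER_EVENT
print(f"Post-selection throughput: ~{throughput_mb_per_s / 1000:.0f} GB/s")
# -> Post-selection throughput: ~150 GB/s
```

Even after discarding 99.75% of collisions, the surviving data stream runs to hundreds of gigabytes per second under this assumption, which is why Proudfoot stresses the compute infrastructure behind the detector.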
Proudfoot and the CERN team depend on extremely powerful computing systems to capture, reconstruct, analyze, and visualize the data.
“There are many challenges inherent in this project,” said Proudfoot. “Event simulation is one fundamental challenge we are addressing. We capture such a large quantity of data that we’re stressing the limits of precision of the theoretical calculations. We measure data volumes in petabytes per year, and that volume constantly increases. While advanced compute systems re-process the information and store it, human intervention remains a critical element.”
“Our human brains have to do quite a bit of work as well,” joked Proudfoot. “We must invent better algorithms to analyze the data and tease out the needed information. Algorithms have limitations, often due to computing power, and we always seek to refine them to accommodate more data and process it in more focused ways. Data collection is a time-consuming process extending over years of operation of the ATLAS detector and the Large Hadron Collider. We also work with very high-intensity beams and radiation, so the systems must operate flawlessly and extremely efficiently to accomplish their jobs.”
The development of exascale computing will have a profound impact on the team’s work. It will enable massively parallel operations, making it possible to process in days or hours vast amounts of data that previously could have required years, as in some of the most precise event simulations.
Exascale enabling breakthroughs at Argonne
Since 2015, Argonne researchers have been exploring the use of supercomputers at the Argonne Leadership Computing Facility (ALCF) to meet the LHC’s growing computing demands. The already-massive amount of data produced at the LHC is expected to grow significantly when CERN transitions to the High-Luminosity LHC, a facility upgrade being carried out now for operations planned in 2026.
Exascale systems, like the Aurora supercomputer scheduled to arrive at Argonne in 2021, will not change the fundamental ways in which researchers use HPC today. Many researchers already rely on machine learning algorithms and deep neural networks. However, exascale machines will allow scientists to explore vast volumes of data with higher precision and far more quickly.
“At Argonne, we seek solutions that allow us to take the workflow used in our current processes, combine it with the millions and millions of lines of code we have collected over the last couple of decades, and get them to fit in a way in which we can get meaningful answers out of the experiment,” said Proudfoot.
To prepare for the scale and architecture of Aurora, Proudfoot is leading an ALCF Early Science Program project focused on developing exascale workflows, algorithms, and machine learning capabilities to enable the team to use the system effectively in their search for new physics discoveries. The technology underlying the new system will offer levels of computing power previously inaccessible to researchers. “Many scientists are eager to run experiments on Aurora, and of course, some lucky person will be the first to experience its power,” said Proudfoot. “Once online, Aurora will allow us to shape new ideas, explore the world outside, combine ideas from other sciences, and share that knowledge to inspire new algorithms. Perhaps we will also uncover new science. That would be the most ambitious goal of exascale computing, and of course, human inspiration will help us pursue that path.”
“We are thankful for partners like Intel and Cray that bring both human skills and technology to the table to make the Aurora system possible,” said Proudfoot. “A future generation of Intel Xeon Scalable processors, next-generation Intel Optane DC persistent memory, and technologies based on a new Intel Xe architecture underlying Aurora will help us push the boundaries of exascale computing. Cray’s software stack ties the components together to make a fully integrated system possible. Together, we are making a new era of exascale computing possible. It’s an exciting time for computer science!”
“We are thrilled to have the nation’s first exascale machine here at Argonne and to have the opportunity to develop programs to run on it. Throughout my career, I have worked on several bespoke projects, and there is no greater intellectual reward than an opportunity like that,” Proudfoot shared enthusiastically. “Aurora will allow us to look at alternative types of algorithms and do things in days that previously could take many months. We need our brains to keep pace with Aurora’s speed and use that inspiration to look beyond the Standard Model in pursuit of discoveries.”
A future of optimism
“We’re always looking for hints of something we have not discovered,” noted Proudfoot. “The opportunity to work with scientists from Asia, Russia, South America, Europe, and elsewhere means everyone comes to the table with new ideas and new approaches to explore. We all share a common goal of trying to understand the underlying properties of particle interactions. I feel lucky to have such a fabulous experience at this point in my life.”
Describing his role, Proudfoot takes a humble view of his accomplishments. “I see myself as a fundamental research scientist looking at the physical properties of our universe. Alongside my colleagues, we strive to produce knowledge and understanding. All the scientists, graduate students, and others working with CERN gain an enriched experience from studying in this unique environment. I hope our work inspires people, especially the next generation of scientists who will build upon — and further — the work we can do in our lifetimes.”
Rob Johnson spent much of his professional career consulting for a Fortune 25 technology company. Currently, Rob owns Fine Tuning, LLC, a strategic marketing and communications consulting company based in Portland, Oregon. As a technology, audio, and gadget enthusiast his entire life, Rob also writes for TONEAudio Magazine, reviewing high-end home audio equipment.
This article was produced as part of Intel’s HPC editorial program, with the goal of highlighting cutting-edge science, research and innovation driven by the HPC community through advanced technology. The publisher of the content has final editing rights and determines what articles are published.