In 2012, the Large Hadron Collider (LHC) revealed a particle with the predicted characteristics of the Higgs boson. This was a momentous achievement for physics, verifying theories about the Higgs field in the Standard Model. Some naysayers wondered if there was nothing left to discover. But it was only the beginning of more questions for researchers in high-energy physics, like Professors Anna Hasenfratz, Peter Boyle, and Oliver Witzel.
Delving into the Beyond Standard Model
Dr. Hasenfratz is a professor of theoretical physics at the University of Colorado Boulder (CU). Professor Boyle and Dr. Witzel are researchers in the School of Physics and Astronomy at the University of Edinburgh in Scotland. Together they are studying prototype systems that take them into the realm of Beyond Standard Model (BSM) particle physics. BSM theories explore aspects of physics that the Standard Model cannot explain. In particular, the three are trying to understand the nature of the Higgs boson.
“We are interested in finding theories describing a composite Higgs particle, which could either be dilaton-like or a pseudo Nambu-Goldstone boson,” explained Dr. Witzel.
“We’re looking for models in which the Higgs particle itself might be made up of constituent particles in a bound state,” added Dr. Boyle. “But the effects of this substructure will only be visible at very, very high energy scales, and so it may be hard to see them without a much higher energy accelerator than we have access to.” The LHC might just not be big enough.
Hasenfratz, Boyle, and Witzel are working with quantum field theories, in particular leveraging insights from quantum chromodynamics, or QCD, to devise computer models that can simulate the makeup of a Higgs particle. QCD is a well-established, long-standing theory that describes the interactions of quarks and gluons, according to Hasenfratz. These are the elementary particles that make up hadrons, like protons and neutrons. Quarks are held together in a proton by the strong interaction, one of the four fundamental interactions in particle physics along with electromagnetism, the weak interaction, and gravitation.
Quarks with Flavor and Color
In physics, quarks come in “flavors,” such as up and down, and carry a charge called “color” (thus the name, quantum chromodynamics). “A proton has an up, up, and down quark,” stated Boyle. “Neutrons have up, down, down. Maybe there are mutations of quarks we haven’t seen yet that together form bound states, like the Higgs. Maybe there are a larger number of quarks compared to the Standard Model.”
“So, we are speculating and simulating possibilities to see what an experimental signal would look like for the makeup of the Higgs boson,” added Witzel. “Then, maybe we can see it through experimentation.”
“We’re working on a general many-flavor model, which is, I believe, quite an attractive possible BSM description,” said Hasenfratz. “But, since we haven’t observed these particles, the best way to understand the properties of this model is through computer simulations. In QCD, the community has developed several different formulations to simulate the strong interactions of quarks and gluons. So, we build our many-flavor BSM model using methods of lattice QCD, or LQCD, a particular way to calculate strong interactions. We are not working on QCD; we are working on BSM. But, we use the principles of lattice QCD and apply them to BSM model simulations, which are, at present, the best, most reliable methods, because they address the physical questions and can be systematically improved.”
Creating Grid for QCD and BSM
For their simulations, Boyle and his collaborators created an LQCD code called “Grid” to run on supercomputer clusters. Grid was written to perform data-parallel operations on Cartesian grids. The grids can be distributed in parallel over multiple compute nodes, making the code ideal for large clusters. While there are already LQCD codes available in languages like Fortran, Boyle, with three colleagues at the University of Edinburgh, spent the last two years writing Grid in C++11 from the ground up because of features the language offered to simplify the code base relative to previous codes. “With Grid, we just write fairly high-level primitives for most of the operations,” said Boyle. “We created compile-time expression templates in just 200 lines of code, which allow us to easily change theories, rather than writing a code base that tries to cover every possible situation. So, for example, we can write a new template and quickly change from the three-color theory of QCD to a five-color theory using a single compile-time switch.” Using C++11 has cut the size of their code to roughly one-tenth of existing code bases. “Grid makes a powerful code for doing novel explorations,” he concluded.
Boyle started out writing Grid to port and optimize existing codes for the Intel® Xeon Phi™ processor-based cluster in the university’s Intel Parallel Computing Center (IPCC). “We were quite successful with the initial work,” commented Boyle, “but we decided to move away from the existing codes and completely rewrite Grid in C++11. In the process, we spoke with a small compiler company called Codeplay. They suggested using techniques from the game industry. We met with a student working on a game project, and we white-boarded the challenges we both were facing. We discovered they were similar. I realized we could adopt what the game guys were doing and use the LLVM compiler. It worked beautifully. We are able to write high-level code using primitives and have the code run as fast as if it were written in low-level assembly language.”
Grid is written such that it can be used with many different vector instruction sets, from earlier SSE to Intel® Advanced Vector Extensions 512 (Intel® AVX-512). According to Witzel, the code runs about six times faster on their Intel Xeon Phi processor-based cluster than on Intel® Xeon® processor-based systems. “Grid has been run on several QCD clusters around the world for QCD simulations,” added Witzel. “People are excited to get the code.”
RMACC Summit Helps Answer BSM Questions
Professor Hasenfratz has started using Grid for her research in BSM theories. “We are basically doing molecular dynamics Monte Carlo simulations,” she said. “But our systems are large, four-dimensional problems. The simulations are very time consuming. Working with Peter and Oliver, we had quick access to Grid, and my group became the first testers of this code for BSM theories. We are using it to investigate a very specific BSM system. Technically, it’s an LQCD model with ten flavors. The project requires about 50 million core hours to run. Fortunately, we had Summit come online just recently.”
Rocky Mountain Advanced Computing Consortium (RMACC) Summit is a new supercomputer installed at CU that replaces the university’s previous system, Janus. RMACC Summit is supported by CU Boulder and Colorado State University (CSU) to enhance ongoing research and discovery in a variety of areas, such as Hasenfratz’s projects. RMACC Summit is built on the Intel® Xeon® processor E5 and E7 families with additional Intel Xeon Phi processor nodes, all interconnected by Intel® Omni-Path Architecture (Intel® OPA) fabric.
“Oliver and Peter were able to compile the code specifically for our model, so we could start production runs,” added Hasenfratz. She has been running scaling tests on RMACC Summit in preparation for upcoming production runs. “We are very happy with the results we’re seeing so far,” she said.
“We are still at the beginning,” concluded Hasenfratz. “We are trying to understand properties of a system that has not been studied before. The model we are considering now has many promising properties, but at this point we are still asking only the most basic questions.”
For Hasenfratz, Witzel, and Boyle, there is much, much more to discover.
Ken Strandberg is a technical storyteller. He writes articles, white papers, seminars, web-based training, video and animation scripts, and technical marketing and interactive collateral for emerging technology companies, Fortune 100 enterprises, and multinational corporations. Mr. Strandberg’s technology areas include software, HPC, industrial technologies, design automation, networking, medical technologies, semiconductors, and telecom. Mr. Strandberg can be reached at email@example.com.