Texas Advanced Computing Center’s latest supercomputer powers transformative discoveries across science and engineering
You hear it before you see it — a roar like a factory in full production. But instead of cars or washing machines, this factory produces scientific knowledge.
Stampede, the newest supercomputer at the Texas Advanced Computing Center (TACC) and one of the most advanced scientific research instruments in the world, fills aisle after aisle of a new 11,000-square-foot data center at The University of Texas at Austin.
Through the machine room windows, you can see 182 racks holding more than 500,000 interconnected computer processors, working in parallel. Inside, wind whips from in-row coolers, wires snake up and over the racks, and chilled water courses below the floor as Stampede performs unprecedented calculations on behalf of scientists and engineers nationwide.
Staff at TACC — working closely with Dell and Intel engineers and University researchers — designed, built and, in January 2013, deployed Stampede for the U.S. open science community. TACC and The University of Texas at Austin competed against the top supercomputing centers and universities for the honor of hosting one of the most advanced systems in the world, and won. The award was funded by the National Science Foundation (NSF) as part of its commitment to transformational science. According to the November 2012 Top 500 list of supercomputers, Stampede is the seventh most powerful computer system on the planet. It is the most capable of the 16 high-performance computing (HPC), visualization and data analysis resources in the NSF-funded Extreme Science and Engineering Discovery Environment (XSEDE) program and, as an instrument dedicated to academic research, the most powerful such system in the U.S. on the list, capable of outperforming a small city’s worth of personal computers.
“This is a proud moment for The University of Texas,” said UT President William Powers at Stampede’s dedication ceremony in March. “Stampede will serve UT and the nation as it enables scientific exploration that would not otherwise be possible, and it continues TACC’s tradition of providing powerful, comprehensive and leading-edge advanced computing technologies to the open science community.”
As its name suggests, Stampede harnesses the power of a half-million computer processors, combining them to trample the largest and most challenging computational problems. Stampede’s peak performance currently tops out at nearly 10 petaflops, or 10 quadrillion floating-point operations per second. Sixteen times more powerful than the Ranger system it replaces (which was still the 50th fastest supercomputer in the world when it was decommissioned in February), Stampede will enable scientists to address classes of problems they could never approach before.
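To get a feel for that figure, a back-of-the-envelope comparison helps. The short Python sketch below divides Stampede’s roughly 10-petaflop peak by the throughput of a typical desktop PC; the desktop figure is an illustrative assumption, not a number from TACC.

```python
# Back-of-the-envelope comparison of Stampede's peak rate with a typical desktop PC.
# The desktop figure is an illustrative assumption, not a number from TACC.

stampede_peak_flops = 10e15   # ~10 petaflops = 10 quadrillion floating-point operations/second
desktop_peak_flops = 100e9    # assume a desktop PC sustains roughly 100 gigaflops

equivalent_desktops = stampede_peak_flops / desktop_peak_flops
print(f"Roughly {equivalent_desktops:,.0f} desktop PCs")  # ~100,000, a small city's worth
```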
Enabling New Explorations
In recent years, supercomputers have become one of the most critical general-purpose instruments for conducting scientific research. Known as the “third pillar” of science, computer simulations and models complement theory and experimentation and allow researchers to explore phenomena that cannot be captured via observation or laboratory experiments. In addition, supercomputers like Stampede now enable a “fourth pillar” — data-driven science — by integrating simulation and data-intensive applications to help researchers make new discoveries. The capabilities offered by Stampede are broad and help researchers across all areas of science, engineering, the social sciences and the humanities solve problems with far-reaching implications.
- Geosciences
The accelerating flow of ice streams from Antarctica has the potential to raise sea levels significantly, impacting millions of people. However, Antarctica’s ice rises thousands of feet above the ground, and it’s unclear what lies underneath or how that hidden topography influences the ice streams. Drilling ice cores across the continent is infeasible, so Omar Ghattas and his team at The University of Texas at Austin have been using Stampede to better understand and represent the base of the ice sheet and the flow of ice from Antarctica into the sea.
Using a new, high-fidelity numerical ice flow model, these researchers are running thousands of high-resolution, continental-scale simulations on Stampede. Each simulation fine-tunes the unknown parameters, such as the friction at the base of the ice sheet, so that the model better reproduces the ice flow observed at the top surface. Problems of this kind can be solved only on supercomputing systems like Stampede.
“How often does a scientist get an instrument that’s 50 times as powerful as the one it replaces?” Ghattas exclaimed. “It’s a massive step.”
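The inversion idea behind this work can be illustrated with a deliberately tiny sketch: pick the unknown basal-friction value that makes a toy flow model best match “observed” surface velocities. The one-parameter model and all numbers below are illustrative assumptions, not the team’s high-fidelity ice-sheet code.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy inversion: find the basal friction that makes a (drastically simplified)
# flow model reproduce "observed" surface velocities. The one-parameter model
# u = tau / beta is a stand-in, not the team's actual ice-sheet model.

tau = np.linspace(50.0, 150.0, 20)   # synthetic driving stress along a flow line
beta_true = 4.0                      # the "true" friction we pretend not to know
rng = np.random.default_rng(0)
u_observed = tau / beta_true + rng.normal(0.0, 0.5, tau.size)  # noisy "observations"

def misfit(beta):
    """Sum-of-squares mismatch between modeled and observed surface velocities."""
    return np.sum((tau / beta - u_observed) ** 2)

result = minimize_scalar(misfit, bounds=(0.1, 20.0), method="bounded")
print(f"Recovered basal friction: {result.x:.2f}  (true value: {beta_true})")
```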
- Neuroimaging
Brain imaging is another important example of an area where powerful supercomputers can play an important role. When it comes to gliomas, a type of primary brain tumor, even the most experienced doctors often disagree on the best approach to treatment. Surgeons need to understand how aggressive the tumor is to be able to plan for surgery, radiotherapy and other treatment options; however, it is difficult to determine the full extent of a tumor’s invasion into normal tissue without causing damage to the patient in the process.
“Just looking at a single MRI scan is not enough,” said George Biros, a professor of mechanical engineering and computer science at The University of Texas at Austin and a two-time winner of the Gordon Bell Prize. “We need to combine images acquired using several imaging modalities, apply pattern recognition and statistical inference tools and integrate them with sophisticated biophysical models to be able to quantitatively interpret the images.”
Working with Christos Davatzikos, a professor of radiology at the University of Pennsylvania School of Medicine, Biros is creating new methods that use supercomputers to quickly and accurately assimilate MRI scans and other imaging modalities, and then combine these images with biophysical models that represent tumor growth. It’s a “Big Data” problem with the added complexity of simulations and real-time constraints. The researchers have found that adding these math-driven biophysical tumor models makes the interpretation of the images more accurate and effective, which in turn allows surgeons to make more informed decisions about treatment options.
“A machine like Stampede makes this possible,” Biros said.
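Biophysical glioma models of this kind are often built, in the research literature, on reaction-diffusion equations in which tumor cells both spread into surrounding tissue and proliferate. The minimal one-dimensional sketch below illustrates that class of model only; the parameter values are illustrative assumptions, not clinical figures or the researchers’ actual code.

```python
import numpy as np

# Minimal 1-D reaction-diffusion model of the type used for glioma growth in the
# literature: tumor cell density u diffuses into tissue and proliferates logistically,
#     du/dt = D * d2u/dx2 + rho * u * (1 - u).
# All parameter values are illustrative assumptions, not clinical figures.

D, rho = 0.1, 0.5                 # diffusion (mm^2/day) and proliferation (1/day) rates
dx, dt = 0.5, 0.1                 # grid spacing (mm) and time step (days)
x = np.arange(0.0, 50.0, dx)
u = np.exp(-((x - 25.0) ** 2))    # initial condition: a small tumor seed mid-domain

for _ in range(300):              # simulate 30 days with an explicit scheme
    laplacian = (np.roll(u, 1) - 2.0 * u + np.roll(u, -1)) / dx**2
    u = u + dt * (D * laplacian + rho * u * (1.0 - u))

extent_mm = np.sum(u > 0.1) * dx  # width of the region above 10% of carrying capacity
print(f"Simulated tumor extent after 30 days: {extent_mm:.1f} mm")
```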
- Earth Science
Another early scientific success story on Stampede concerns efforts to predict the likelihood of earthquakes throughout California. Researchers from the Southern California Earthquake Center (SCEC) used Stampede to forecast the frequency of damaging earthquakes in California for the latest Uniform California Earthquake Rupture Forecast.
An Earthquake Rupture Forecast determines the probability of all possible damaging earthquakes throughout a region over a specified time span. To prepare the new California forecast, the SCEC group ran hundreds of thousands of data inversions to explore the full range of fault slip rates, earthquake magnitudes and locations. The inversions incorporated the history of earthquake activity throughout California and the fact that faults interact with one another during an earthquake. They were then combined to provide a comprehensive view of earthquake risk across the region.
The results of the simulations on Stampede, once approved, will be incorporated into the U.S. Geological Survey’s National Seismic Hazard Maps, which are used to set building codes and insurance rates.
“We do a lot of high-performance computing calculations,” said Thomas Jordan, director of SCEC, “but it’s rare that any of them have this level of potential impact.”
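A small sketch shows how long-term rupture rates of the kind produced by such inversions are commonly turned into time-window probabilities, using the standard Poisson (time-independent) assumption. The rates below are made-up placeholders, not UCERF or SCEC results.

```python
import numpy as np

# Generic sketch: convert long-term rupture rates into the probability of at least
# one damaging event in a 30-year window, using the common Poisson (time-independent)
# assumption. The rates are made-up placeholders, not UCERF results.

rates_per_year = np.array([0.002, 0.005, 0.0008, 0.010])  # hypothetical M >= 6.7 ruptures
window_years = 30.0

total_rate = rates_per_year.sum()   # combined events per year across all ruptures
prob_at_least_one = 1.0 - np.exp(-total_rate * window_years)
print(f"P(at least one M>=6.7 rupture in {window_years:.0f} years) = {prob_at_least_one:.2f}")
```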
- Nanomaterials
Another critically important research area involves the creation of new materials to help mitigate climate change. Imagine it were possible to remove carbon dioxide (CO2) from a smokestack and turn it into something useful. Alexie Kolpak, a professor of mechanical engineering at MIT, is using Stampede to explore a class of nanomaterials that can capture CO2 from an exhaust stream and convert it into cyclic carbonates, a useful and valuable product for industry.
She’s investigating a nanomaterial that combines a thin-film oxide with a ferroelectric substance that changes structure when an electric field is applied. In one state, it captures CO2; in the other, it releases the CO2 so it can react to form cyclic carbonates. These reactions occur at the atomic level, beyond the reach of even the most powerful microscopes, and can only be investigated through computer simulations.
The computational experiments Kolpak and her collaborators perform on Stampede inspire new ideas for how to manipulate the surface of substances to do important tasks — like clean the air or convert sunlight into energy — more efficiently than ever before. They then create and test the materials in the laboratory to see how they perform in real-world applications.
“It’s a really exciting time,” Kolpak said. “We’re getting to the point, with the computational speed and the development of algorithms, that we’re looking at a lot more realistic systems, and that’s going to help guide the research that’s going on in the energy areas and in other areas, too.”
A Growing Base of Users across Disciplines
In the past, supercomputers were often used for a small subset of science and engineering problems, but increasingly, systems like Stampede do far more than difficult calculations. Stampede lets scientists simulate, visualize, analyze, store and share their knowledge with others around the world. It’s an all-in-one computing ecosystem for multi-disciplinary, multi-institutional science; without it, the research community as a whole would lack the resources it needs to make new discoveries.
And supercomputing’s not only for the hard sciences anymore. Researchers in the social sciences, digital humanities and arts are using Stampede to enable new discoveries, too.
Hundreds of thousands of hours of important spoken text audio files — dating back to the 19th century — are only marginally accessible for listening and almost completely inaccessible for new forms of analysis and instruction in the digital age. These files, which comprise poetry readings, interviews of folk musicians, artisans and storytellers, and stories by elders from tribal communities, often contain the only recordings of significant literary figures and bygone oral traditions.
In 2013 and 2014, the School of Information at The University of Texas at Austin and the Illinois Informatics Institute at the University of Illinois at Urbana-Champaign will host two High Performance Sound Technologies for Access and Scholarship (HiPSTAS) workshops to explore how new technologies and methods can open up unique avenues of inquiry into these documents.
Stampede will provide high-performance computing, large-scale visualization and massive storage capabilities to sound archivists at the workshops to help them search for patterns and gain insights into spoken language and music. Several large sound archives will be available to the researchers on Stampede, including recordings of Ojibwe elders discussing traditional knowledge, speeches by Lyndon Baines Johnson and Lady Bird Johnson, and an oral history of the oil boom in Texas. Stampede enables these researchers to analyze massive sound archives at unprecedented speeds, increasing the pace of new discoveries.
“Humanists interested in sound scholarship, stewards of sound collections, and computer scientists and technologists versed in computational analytics and visualizations of sound will develop more productive tools for advancing scholarship in spoken text audio if we consider the needs, resources and possibilities of developing a digital infrastructure for the study of sound together,” said Tanya Clement, one of the conference organizers and professor of Information Science at The University of Texas at Austin.
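A minimal sketch suggests the kind of per-recording feature extraction that archive-scale analysis rests on: compute a spectrogram for each file so recurring spectral patterns can be searched across thousands of recordings. The file path and parameters are hypothetical placeholders, not part of the HiPSTAS toolchain.

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

# Sketch of per-recording feature extraction for archive-scale pattern search:
# compute a spectrogram so recurring spectral patterns can be compared across files.
# The file path below is a hypothetical placeholder.

sample_rate, audio = wavfile.read("archive/recording_0001.wav")
if audio.ndim > 1:
    audio = audio.mean(axis=1)        # mix stereo recordings down to mono

freqs, times, power = spectrogram(audio, fs=sample_rate, nperseg=1024)
dominant_hz = freqs[power.mean(axis=1).argmax()]
print(f"Dominant frequency: {dominant_hz:.0f} Hz over {times[-1]:.1f} s of audio")
```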
What Will Stampede Enable Next?
Stampede is arguably one of the most advanced technological triumphs of the early 21st century, but it’s only as powerful as the scientists who use it. During its first five months of operations, more than 6,200 scientists used the system for over 700 projects. Requests for time on Stampede — which is free to the open science community and allocated through a peer review process — outstripped available time on the system by a factor of two.
This rapid adoption is a testament to Stampede’s broad appeal to the scientific community and the system’s ease of use. Stampede will continue to grow as new and upgraded components are added and, in its lifetime, the system is expected to deliver the equivalent of more than 400,000 years of computing to tens of thousands of scientists.
Imagine the discoveries Stampede will enable across all fields of knowledge.
Aaron Dubrow is External Relations Manager and Science Writer at Texas Advanced Computing Center. He may be reached at editor@ScientificComputing.com.