Industries and research institutions working in seismology (the scientific study of earthquakes) and seismic imaging (a geophysical technique for investigating the earth’s subsurface) have made significant progress in the last several years thanks to advances in high-performance computing (HPC). The sheer compute power of modern HPC systems has given researchers and scientists unprecedented insight into how the earth works.
When we consider the devastating effects of earthquakes, or the growing global interest in more efficient oil and gas exploration, processing and analyzing massive amounts of data with large-scale computing technology becomes vitally important. With the technological improvements of HPC, researchers can achieve results previously thought impossible: they can conduct more complex simulations with larger computational requirements, process and archive huge volumes of raw seismic data, and run geophysical simulations of rough or uncharted territory. As a result, scientists studying earthquakes can provide life-saving recommendations on how to mitigate damage from seismic and volcanic events.
For example, the Earthquake Research Institute (ERI) at the University of Tokyo conducts research to advance scientific understanding of seismic and volcanic phenomena on Earth and to mitigate the disasters they cause. In mid-2015, ERI implemented SGI’s large-scale parallel computing solution and achieved dramatic improvements in performance and scale, supporting a wide variety of research aimed at preventing and mitigating seismic and volcanic disasters.
By deploying GPU-accelerated SGI servers, the Department of Geosciences at Princeton University has been able to drive next-generation earthquake research and simulate seismic activity 32 times faster than with its previous technology, resulting in significant strides in its understanding of seismic wave propagation.
By using seismic processing to image the Earth’s subsurface and the complexities that lie below, energy companies can find and extract oil from increasingly remote locations with greater accuracy, resulting in cost savings and reduced environmental impact. Reservoir simulation, in turn, provides insight into fluid movement within a potential reservoir. Oil drilling can be very expensive, so by using innovative technologies backed by HPC, organizations can analyze massive amounts of data and improve the probability of success before ever drilling, saving time, money and resources.
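To make the computational challenge concrete, the kernel at the heart of seismic forward modeling is typically a wave-equation solver stepped through time over a large grid. The sketch below is a deliberately simplified, hypothetical illustration (a 1D constant-velocity acoustic model in Python, not any vendor’s production code); real surveys solve the 3D elastic problem over billions of grid cells, which is what drives the HPC requirements described here.

```python
# Illustrative sketch only: 1D acoustic wave propagation with explicit
# finite differences. Grid size, velocity and source parameters are
# arbitrary values chosen for demonstration.
import numpy as np

nx, dx = 1000, 5.0          # grid points and spacing (m)
c = 2000.0                  # constant acoustic velocity (m/s)
dt = 0.8 * dx / c           # time step within the CFL stability limit
nt = 1500                   # number of time steps
src = nx // 2               # source injected at the middle of the model

u_prev = np.zeros(nx)       # wavefield at time step n-1
u_curr = np.zeros(nx)       # wavefield at time step n
coeff = (c * dt / dx) ** 2

def ricker(t, f0=10.0, t0=0.1):
    """Ricker wavelet source time function (peak frequency f0 Hz)."""
    a = (np.pi * f0 * (t - t0)) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

for n in range(nt):
    # Second-order update of interior points: u_tt = c^2 * u_xx
    u_next = np.zeros(nx)
    u_next[1:-1] = (2.0 * u_curr[1:-1] - u_prev[1:-1]
                    + coeff * (u_curr[2:] - 2.0 * u_curr[1:-1] + u_curr[:-2]))
    u_next[src] += ricker(n * dt) * dt ** 2   # inject the source term
    u_prev, u_curr = u_curr, u_next

print("Peak amplitude after propagation:", np.abs(u_curr).max())
```

Even this toy model must take thousands of time steps; scaling it to three dimensions, realistic velocity models and thousands of source shots is why seismic imaging and reservoir studies lean so heavily on supercomputers.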
Supercomputers have become a crucial tool for energy companies’ research, discovery and extraction operations. Global energy supermajor Total upgraded its Pangea supercomputer earlier this year to a processing power equivalent to approximately 80,000 personal computers and a storage capacity equivalent to approximately 27 million compact discs (CDs). As a result, Total has optimized its reservoir simulation and drilling operations, yielding significant time and cost savings.
In the same vein, computing technology solutions from SGI have helped Malaysian oil and gas giant PETRONAS reduce the time needed to analyze voluminous quantities of seismic data while producing much more precise images of the subsurface.
As every industry amasses more and more data, we have the opportunity to understand the world around us with increasing accuracy. Without technologies that can keep pace with the growth of data, and the computational power needed to analyze and process it, we could make sense of only a fraction of the data available to us, limiting our understanding of how our world works. HPC helps close that gap, enabling us to solve some of the world’s most challenging problems.
Jorge Titinger is president and CEO of SGI.