Japan is no stranger to extreme weather. Every year, much of the country faces a rainy season as well as the prospect of typhoons from August through October. At the same time, rising sea levels and weather uncertainties related to climate change are putting many of Japan’s densely populated cities and much of its coastline at risk from storm surges and extreme weather events. In 2015, for example, parts of the country experienced unusually heavy rain that caused significant flooding and landslides, putting millions of people in danger. And when you consider that the Tokyo metro area alone is home to more than 30 million people, you quickly start to understand why accurate weather forecasting in Japan is essential to disaster preparedness.
As the home of the Disaster Prevention Research Institute (DPRI), Kyoto University has contributed to Japan’s preparedness efforts since 1951. Recently, forecasting research at the DPRI got a boost when Kyoto University upgraded its supercomputing resources to include a Cray XC40 with Intel® Xeon Phi™ processors and a high-performance DDN SFA14K large-scale storage system. Now, researchers like Takeshi Enomoto, an Associate Professor in the DPRI’s Atmospheric and Hydrospheric Disasters division, can tap into processing power that is on par with leading numerical weather prediction (NWP) centers.
Understanding the mechanisms of high-impact weather
Professor Enomoto develops Atmospheric General Circulation Models (AGCMs) for predicting atmospheric flows over the globe. “Essentially, we perform numerical experiments with an eye on helping humanity prepare for and mitigate meteorological disasters,” says Enomoto. “My goal is to understand the mechanisms and predictability of high-impact weather events. I would like to improve forecasts of high-impact weather up to a month ahead.”
Anyone who deals with weather forecasting can appreciate the importance and complexity of this goal. A big part of the challenge is that the Earth system is so complex that researchers need to pull together knowledge and data from different disciplines to understand and simulate it.
“I work with researchers at a range of different institutions, including universities, the Meteorological Research Institute of the Japan Meteorological Agency (JMA), the Japan Agency for Marine-Earth Science and Technology, and RIKEN. From an international perspective, we also collaborate with researchers at the U.S. National Centers for Environmental Prediction and at the European Centre for Medium-Range Weather Forecasts (ECMWF). We all have different specialties, including tropical meteorology, physical oceanography and applied mathematics,” explains Enomoto.
When it comes to high-impact weather, typhoon tracking is of special importance to Enomoto. “While typhoons bring us water essential for life, they may cause meteorological disasters due to rain and wind, so it is important to forecast them correctly,” he notes. “The good news is that over the years we’ve steadily reduced track errors to within about 80 km on average for a one-day forecast. The bad news is that some typhoons are difficult to predict, resulting in large track errors.”
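To make the notion of track error concrete, the sketch below computes a track error as the great-circle (haversine) distance between a forecast and an observed typhoon center position. The coordinates and function name are hypothetical and purely illustrative; this is not drawn from the JMA’s or the DPRI’s verification tools.

```c
#include <math.h>
#include <stdio.h>

#define EARTH_RADIUS_KM 6371.0
#define DEG_TO_RAD (3.14159265358979323846 / 180.0)

/* Great-circle (haversine) distance in km between two points given in degrees. */
static double track_error_km(double lat1, double lon1, double lat2, double lon2)
{
    double dlat = (lat2 - lat1) * DEG_TO_RAD;
    double dlon = (lon2 - lon1) * DEG_TO_RAD;
    double a = sin(dlat / 2.0) * sin(dlat / 2.0) +
               cos(lat1 * DEG_TO_RAD) * cos(lat2 * DEG_TO_RAD) *
               sin(dlon / 2.0) * sin(dlon / 2.0);
    return 2.0 * EARTH_RADIUS_KM * asin(sqrt(a));
}

int main(void)
{
    /* Hypothetical forecast vs. observed typhoon center positions. */
    double forecast_lat = 30.0, forecast_lon = 132.0;
    double observed_lat = 30.5, observed_lon = 132.6;

    printf("track error: %.1f km\n",
           track_error_km(forecast_lat, forecast_lon,
                          observed_lat, observed_lon)); /* roughly 80 km here */
    return 0;
}
```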
Seeing steady progress with simulations
Professor Enomoto has seen steady progress in simulations and forecasting throughout his 16-year career. In 2002, he was part of a team conducting global atmospheric simulations at 10-km mesh on the Earth Simulator system, which had a peak performance of 40 teraflops and was the top-ranked supercomputer in the world from 2002 to 2004. In fact, his team won the Gordon Bell Prize for Peak Performance in 2002 for its AFES code, which achieved 26.5 teraflops, or 65% of the Earth Simulator system peak that year. At the time, the horizontal resolution of operational weather prediction was roughly 60 km, so the Earth Simulator team made an important leap forward. It opened the door to higher resolution global atmospheric simulations that could identify some local meteorological hazards in mesoscale (O(100 km)) weather events. But pinpointing potential local meteorological hazards in a storm takes very high resolution, and because the atmosphere is four dimensional, refining the mesh by a factor of six in each horizontal direction, in the vertical, and in time meant researchers needed roughly 1,000 times more computing power (6 × 6 × 6 × 6 ≈ 1,300).
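The factor of 1,000 follows from that simple scaling argument: a six-fold mesh refinement applies in each horizontal direction, and the vertical spacing and time step must shrink roughly in proportion, so the cost grows by about six to the fourth power. The back-of-the-envelope sketch below assumes that cost model and is illustrative only.

```c
#include <stdio.h>
#include <math.h>

int main(void)
{
    double coarse_mesh_km = 60.0;  /* operational global resolution circa 2002 */
    double fine_mesh_km   = 10.0;  /* Earth Simulator global run */
    double refine = coarse_mesh_km / fine_mesh_km;   /* factor of 6 */

    /* Assume the cost grows with the refinement factor in both horizontal
       directions, in the vertical, and in the time step: refine^4. */
    printf("estimated cost increase: about %.0fx\n", pow(refine, 4.0));
    return 0;
}
```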
Today, leading NWP centers, such as the ECMWF, the UK Met Office, the National Centers for Environmental Prediction (NCEP) and the JMA, have systems capable of delivering the high resolution necessary for more accurate local predictions. Moreover, the new Kyoto University XC40 system, which has a projected peak performance of 5.48 petaflops and is currently the second largest supercomputer in Japan, is comparable to the systems at the leading NWP centers. That means Professor Enomoto and his colleagues at the DPRI can now conduct numerical experiments to investigate the causes of forecast failures using operational models at resolutions comparable to those the NWP centers use.
Analyzing forecast failures
Enomoto is particularly interested in misses related to typhoon track forecasts. He says that most errors are due to two things: uncertainty about initial weather conditions and model imperfection. “When we attempt to predict the future state of the atmosphere by analyzing current estimates produced from weather observations against previous forecasts, there is a lot of room for error,” explains Enomoto. “Our AGCM computer code is based on physical laws, and it is programmed to faithfully simulate atmospheric flows to the best of our knowledge. Model uncertainty arises either from imperfect knowledge or the simplification that’s needed to finish the forecast in reasonable time.”
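The sensitivity to initial conditions that Enomoto describes can be illustrated with a toy chaotic system. The sketch below is a conceptual analogy rather than anything taken from an AGCM: it integrates the classic Lorenz-63 equations from two nearly identical starting states and prints how quickly the trajectories diverge.

```c
#include <stdio.h>
#include <math.h>

/* One forward-Euler step of the Lorenz-63 system with standard parameters. */
static void lorenz_step(double s[3], double dt)
{
    const double sigma = 10.0, rho = 28.0, beta = 8.0 / 3.0;
    double dx = sigma * (s[1] - s[0]);
    double dy = s[0] * (rho - s[2]) - s[1];
    double dz = s[0] * s[1] - beta * s[2];
    s[0] += dt * dx;
    s[1] += dt * dy;
    s[2] += dt * dz;
}

int main(void)
{
    double truth[3]    = {1.0,      1.0, 1.0};  /* "true" initial state  */
    double forecast[3] = {1.000001, 1.0, 1.0};  /* tiny analysis error   */
    double dt = 0.01;

    for (int step = 1; step <= 2000; step++) {
        lorenz_step(truth, dt);
        lorenz_step(forecast, dt);
        if (step % 500 == 0) {
            double d = sqrt(pow(truth[0] - forecast[0], 2) +
                            pow(truth[1] - forecast[1], 2) +
                            pow(truth[2] - forecast[2], 2));
            printf("step %4d  separation %.6f\n", step, d);
        }
    }
    return 0;
}
```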
To identify and correct errors, Enomoto compares analyses prepared by different NWP centers, as well as different climate and NWP models. “We try to untangle the two causes of forecast error to improve both the analysis and our model,” says Enomoto. Currently, for example, his team is comparing model forecasts of Typhoon Yagi, which occurred in 2013. The JMA forecast at the time predicted Yagi would make landfall on Shikoku or Kyushu around 12 UTC on June 9, but the landfall didn’t happen because Yagi turned northeastward and weakened off the southern coast of Japan. Enomoto is investigating the impact of horizontal resolution using the ECMWF OpenIFS model. “So far I have conducted the experiments using the model at 125 km mesh down to 40 km on a Mac Pro with Intel® Xeon® processors. Now I will use a supercomputer to reach the operational resolution of the ECMWF IFS, a 15 km mesh,” he explains.
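One way to see why the 15 km runs call for a supercomputer is to estimate how many horizontal grid columns a global model needs at each mesh size. The sketch below uses the crude approximation of dividing Earth’s surface area by the cell area; the numbers are order-of-magnitude estimates, not OpenIFS grid definitions.

```c
#include <stdio.h>

int main(void)
{
    /* Approximate number of horizontal grid columns in a global model,
       estimated as Earth's surface area divided by the grid-cell area. */
    const double surface_km2 = 5.1e8;            /* ~4 * pi * R^2 */
    const double meshes_km[] = {125.0, 40.0, 15.0};

    for (int i = 0; i < 3; i++) {
        double dx = meshes_km[i];
        printf("%6.0f km mesh: ~%.2e grid columns\n",
               dx, surface_km2 / (dx * dx));
    }
    return 0;
}
```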
Taking advantage of new technology
Since Professor Enomoto is new to Kyoto University’s XC40 system, he is still learning how to tap into the full potential of the Intel Xeon Phi processors. He expects the system will enable him to run OpenIFS workloads, such as the Typhoon Yagi forecast failure analysis, at higher resolutions. “The new system should run OpenIFS and other models at the operational resolution, which means that I will be able to conduct experiments that weren’t possible on our previous system,” says Enomoto.
The increased processing power in Kyoto University’s new XC40 system is only part of the benefit. Many of Professor Enomoto’s workloads are well suited to the manycore Intel Xeon Phi processors in the XC40 system. “With well-programmed parallel code, we can take advantage of the new processor more easily than the previous system,” explains Enomoto. “Vector instructions can help improve performance dramatically,” he adds. “Often the Intel compiler can take care of vectorization, so researchers can concentrate on the formulation rather than performance tuning.”
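The kind of loop a vectorizing compiler handles well is a contiguous sweep over a field with no loop-carried dependencies. The sketch below is a generic example of such a loop, not code from AFES or OpenIFS; a vectorizing compiler such as the Intel compiler will typically apply SIMD instructions to it automatically at normal optimization levels.

```c
#include <stdlib.h>

#define NPOINTS (1 << 20)

/* Apply a pointwise tendency update to a model field: a contiguous loop with
   no loop-carried dependencies, which compilers can usually auto-vectorize. */
void update_field(double *restrict field, const double *restrict tendency,
                  double dt, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        field[i] += dt * tendency[i];
    }
}

int main(void)
{
    double *field    = calloc(NPOINTS, sizeof *field);
    double *tendency = calloc(NPOINTS, sizeof *tendency);
    if (!field || !tendency)
        return 1;

    update_field(field, tendency, 600.0, NPOINTS);  /* e.g. a 600 s time step */

    free(field);
    free(tendency);
    return 0;
}
```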
Professor Hiroshi Nakashima, chair of the Academic Center for Computing and Media Studies (ACCMS) Supercomputing Committee at Kyoto University, is confident Professor Enomoto will be able to make rapid progress with the new system. “The ACCMS strives to provide our users with supercomputers that take advantage of the most advanced technologies, but we also understand that new technologies, such as the Xeon Phi processors on the XC40, pose new challenges for our users. We are happy that our new system is powering Professor Enomoto’s research on typhoons, and we expect that the combination of our experience in manycore and SIMD-vectorized computation along with our collaborative research programs will help his team to quickly advance its research even further,” explains Professor Nakashima.
Ultimately, Enomoto and his colleagues believe that increasingly detailed forecast failure research will provide clues for improving both the analysis methods and the forecast models themselves, eventually leading to better forecasts. Every improvement matters, since increased accuracy can help with everything from fine-tuning evacuation timing to reducing the cost of over-preparing for a typhoon. Forecast failure analysis also helps make more efficient use of computing resources, given the enormous workloads required to produce NWP forecasts.