Upgrading the Hurricane Forecast: Predicting storm intensity with greater accuracy
[Image: Tropical Storm Irene makes landfall near New York City. Courtesy of NOAA]
When Hurricane Irene swept through New England in August 2011, the National Hurricane Center (NHC) did an astounding job of predicting its path. However, Irene arrived significantly weaker than originally forecast, leading to a larger evacuation than would have occurred had NHC’s intensity forecasts been closer to the mark.
“The National Hurricane Center has been doing an excellent job over the past few decades of persistently increasing the hurricane forecast track accuracy,” said Fuqing Zhang, professor of meteorology at the Pennsylvania State University. “But, there have been virtually no improvements in the intensity forecast.”
Predicting how hurricanes form, intensify, or dissipate is a different and more challenging problem than predicting their path.
Zhang, along with his Penn State colleague, Yonghui Weng, in collaboration with researchers at the National Oceanic and Atmospheric Administration (NOAA), has developed a pioneering hurricane modeling and forecasting system that improves on today’s operational methods in several important ways.
First, the system adopts a state-of-the-art computer model with higher resolution (4.5-kilometer grid spacing, versus the 9-kilometer spacing of current operational models), capable of more explicitly resolving a hurricane’s inner-core dynamics and structure, which are crucial to its intensity.
Second, it ingests airborne Doppler radar data collected by planes flying through the hurricane, helping to initialize the models. “NOAA has been collecting this data for 30 years, but it has yet to be put into operational models used by NHC forecasters,” Zhang said.
Third, a new data assimilation method using an ensemble Kalman filter (Zhang’s area of expertise) has improved how the system ingests Doppler radar data.
“We use statistical information about the background and merge it with the observations,” he explained. “By combining the forecast and the observations in a new optimal way, we greatly improved the model initial fields over the existing methods used by the NOAA operational hurricane models.”
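The analysis step Zhang describes, merging background statistics from the ensemble with new observations, can be illustrated with a minimal perturbed-observation ensemble Kalman filter update. This is a sketch in NumPy; the toy dimensions, variable names, and linear observation operator are illustrative assumptions, not details of Zhang’s system:

```python
import numpy as np

def enkf_update(ensemble, obs, H, R, rng):
    """One perturbed-observation EnKF analysis step.

    ensemble : (n_members, n_state) array of forecast states
    obs      : (n_obs,) observation vector
    H        : (n_obs, n_state) linear observation operator
    R        : (n_obs, n_obs) observation-error covariance
    """
    n_members = ensemble.shape[0]
    anomalies = ensemble - ensemble.mean(axis=0)
    # Background covariance estimated from the ensemble spread
    P = anomalies.T @ anomalies / (n_members - 1)
    # Kalman gain: weights the observations against background statistics
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    analysis = np.empty_like(ensemble)
    for i in range(n_members):
        # Each member assimilates its own perturbed copy of the observations
        y_pert = obs + rng.multivariate_normal(np.zeros(obs.shape[0]), R)
        analysis[i] = ensemble[i] + K @ (y_pert - H @ ensemble[i])
    return analysis

# Toy demonstration: a biased two-variable forecast pulled toward observations
rng = np.random.default_rng(0)
truth = np.array([1.0, -2.0])
forecast = truth + 2.0 + rng.normal(0.0, 1.0, size=(20, 2))
analysis = enkf_update(forecast, truth, np.eye(2), 0.01 * np.eye(2), rng)
```

In this toy case the analysis ensemble mean lands much closer to the observed values than the biased forecast did, which is the “optimal combination” Zhang refers to: the correction is large where the background uncertainty is large and small where the observations are noisy.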
Using the Ranger supercomputer at the Texas Advanced Computing Center (TACC), Zhang’s system forecast the track and intensity of every major storm in the Atlantic throughout the 2011 hurricane season, and Zhang shared the results of his simulations on his personal research website.
As a result of these changes, Zhang’s forecasts were shown to improve intensity predictions by an average of 20 to 40 percent over the official NHC forecasts for 2008–2011 storms for which airborne Doppler radar data were available.
First deployed on Ranger at TACC during Hurricane Ike in 2008, Zhang’s prediction system is one of a handful being assessed by the NHC to become part of the operational forecasting system used in emergency situations. Forecasters at the NHC say it offers several strengths that improve upon current methods.
“It may be one of our best hopes for trying to find models that can depict these kinds of changing conditions,” said James Franklin, chief of the Hurricane Specialist Unit at the NHC.
Frank Marks, director of NOAA’s Hurricane Research Division, said Zhang has radically changed the landscape in hurricane intensity prediction.
“The results of his work and the subsequent implementation of his approach in the operational models have demonstrated the first potential breakthrough in intensity prediction since the advent of the Dvorak technique for estimating the storm’s current intensity from satellite,” he said. “Fuqing’s tireless energy and drive has led to a major potential breakthrough.”
Just as several generations of jets are in development while earlier versions fly, the National Hurricane Center tests experimental methods concurrently with operational systems. To do so, they rely on advanced computing systems like Ranger at the Texas Advanced Computing Center, supported by the National Science Foundation.
The ensemble forecasts that Zhang produces require the hurricane simulations to run dozens of times simultaneously, demanding thousands of computer processors working in parallel, a feat that can be achieved on only a few dozen supercomputing systems in the world.
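Because the ensemble members are independent model runs, they parallelize naturally. The pattern can be sketched in miniature with Python’s standard `concurrent.futures` module standing in for a supercomputer batch scheduler; the toy “model” below is purely illustrative, not Zhang’s hurricane model:

```python
from concurrent.futures import ProcessPoolExecutor

def run_member(member_id, initial_state):
    # Stand-in for one hurricane model integration. A real member would
    # advance a full atmospheric model from its own perturbed initial
    # conditions; here we iterate a trivial scalar recurrence instead.
    state = initial_state
    for _ in range(100):
        state = 0.99 * state + 0.01 * member_id  # toy dynamics
    return member_id, state

if __name__ == "__main__":
    n_members = 8
    # Each ensemble member runs in its own process, mirroring how members
    # occupy separate processor groups on a machine like Ranger.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(run_member, range(n_members),
                                [1.0] * n_members))
    # The spread across members is what supplies the forecast uncertainty,
    # the "error bars" Zhang wants to attach to intensity predictions.
    states = [state for _, state in results]
    print(min(states), max(states))
```

On a production system each member is itself a massively parallel simulation, so the total processor count multiplies quickly, which is why only a few dozen machines can host such runs.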
“It would not be feasible to do any of this without the initial jumpstart by TACC and its computing resources,” Zhang said.
Zhang’s research is part of NOAA’s Hurricane Forecast Improvement Project, a 10-year effort to make significant gains in extreme weather prediction. The National Science Foundation and the Office of Naval Research also support Zhang’s work.
Hurricane Irene may have been less intense than originally predicted, but it brought drenching rains and massive flooding throughout Vermont and much of New England.
“We still have problems in predicting the genesis, rapid intensification or weakening, and the eye-wall replacement cycle,” Zhang said. “We still have difficulties in quantifying the uncertainties associated with the prediction. The public wants to know how much weather they’ll get. But, given the inherent uncertainties, we want to include error bars.”
Improving these aspects of hurricane forecasting will require even more powerful supercomputers. With TACC’s Stampede system — approximately 20 times larger than Ranger — coming online in January 2013, Zhang will have the opportunity to put additional predictive power into his storm forecasts, a fact that galvanizes him.
“We’re having an impact in terms of what we do, and the effects on day-to-day life,” Zhang said. “And, of course, the nature of hurricanes always excites us.”