Seafaring vessels and offshore platforms endure a constant battery of waves and currents. Over decades of operation, these structures can, without warning, meet head-on with a rogue wave, freak storm, or some other extreme event, with potentially damaging consequences.
Now engineers at MIT have developed an algorithm that quickly pinpoints the types of extreme events that are likely to occur in a complex system, such as an ocean environment, where waves of varying magnitudes, lengths, and heights can create stress and pressure on a ship or offshore platform. The researchers can simulate the forces and stresses that extreme events—in the form of waves—may generate on a particular structure.
Compared with traditional methods, the team’s technique provides a much faster, more accurate risk assessment for systems that are likely to endure an extreme event at some point during their expected lifetime, by taking into account not only the statistical nature of the phenomenon but also the underlying dynamics.
“With our approach, you can assess, from the preliminary design phase, how a structure will behave not to one wave but to the overall collection or family of waves that can hit this structure,” says Themistoklis Sapsis, associate professor of mechanical and ocean engineering at MIT. “You can better design your structure so that you don’t have structural problems or stresses that surpass a certain limit.”
Sapsis says that the technique is not limited to ships and ocean platforms, but can be applied to any complex system that is vulnerable to extreme events. For instance, the method may be used to identify the type of storms that can generate severe flooding in a city, and where that flooding may occur. It could also be used to estimate the types of electrical overloads that could cause blackouts, and where those blackouts would occur throughout a city’s power grid.
Sapsis and Mustafa Mohamad, a former graduate student in Sapsis’ group who is now an assistant research scientist at the Courant Institute of Mathematical Sciences at New York University, are publishing their results this week in the Proceedings of the National Academy of Sciences.
Bypassing a shortcut
Engineers typically gauge a structure’s ability to withstand extreme events by using computationally intensive simulations to model its response to, for instance, a wave coming from a particular direction with a certain height, length, and speed. These simulations are highly complex, as they model not just the wave of interest but also its interaction with the structure. By simulating the entire “wave field” as a particular wave rolls in, engineers can estimate how a structure might be rocked and pushed by that wave, and what resulting forces and stresses may cause damage.
These risk assessment simulations are incredibly precise, and in an ideal situation they might predict how a structure would react to every single possible wave type, whether extreme or not. But such precision would require engineers to simulate millions of waves with different parameters such as height and length scale, a process that could take months to compute.
“That’s an insanely expensive problem,” Sapsis says. “To simulate one possible wave that can occur over 100 seconds, it takes a modern graphics processing unit, which is very fast, about 24 hours. We’re interested in understanding the probability of an extreme event over 100 years.”
As a more practical shortcut, engineers use these simulators to run just a few scenarios, choosing to simulate several random wave types that they think might cause maximum damage. If a structural design survives these extreme, randomly generated waves, engineers assume the design will stand up against similar extreme events in the ocean.
But in choosing random waves to simulate, Sapsis says, engineers may miss other less obvious scenarios, such as combinations of medium-sized waves, or a wave with a certain slope that could develop into a damaging extreme event.
“What we have managed to do is to abandon this random sampling logic,” Sapsis says.
A fast learner
Instead of running millions of waves or even several randomly chosen waves through a computationally intensive simulation, Sapsis and Mohamad developed a machine-learning algorithm to first quickly identify the “most important” or “most informative” wave to run through such a simulation.
The algorithm is based on the idea that each wave has a certain probability of contributing to an extreme event on the structure. The probability itself carries some uncertainty, or error, since it reflects the behavior of a complex dynamical system. Moreover, some waves are more certain to contribute to an extreme event than others.
The researchers designed the algorithm so that they can quickly feed in various types of waves and their physical properties, along with their known effects on a theoretical offshore platform. From the known waves that the researchers plug in, the algorithm essentially “learns,” making a rough estimate of how the platform will behave in response to any unknown wave. Through this machine-learning step, the algorithm learns how the offshore structure behaves over all possible waves. It then identifies the particular wave that maximally reduces the error in the estimated probability of extreme events: a wave that has a high probability of occurring and is likely to lead to an extreme event. In this way, the algorithm goes beyond a purely statistical approach and takes into account the dynamical behavior of the system under consideration.
The researchers tested the algorithm on a theoretical scenario involving a simplified offshore platform subjected to incoming waves. The team started by plugging four typical waves into the machine-learning algorithm, along with the waves’ known effects on the platform. From this, the algorithm quickly identified the dimensions of a new wave that has a high probability of occurring and maximally reduces the error in the estimated probability of an extreme event.
The team then plugged this wave into a more computationally intensive, open-source simulation to model the response of a simplified offshore platform. They fed the results of this first simulation back into their algorithm to identify the next best wave to simulate, and repeated the entire process. In total, the group ran 16 simulations over several days to model a platform’s behavior under various extreme events. In comparison, the researchers carried out simulations using a more conventional method, in which they blindly simulated as many waves as possible, and were able to generate similar statistical results only after running thousands of scenarios over several months.
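The loop the article describes can be illustrated with a minimal, self-contained sketch. This is not the authors’ implementation: it assumes a one-parameter description of a wave (its height), an assumed probability density for how often such waves occur, a mock stand-in `expensive_simulation` for the costly wave-structure solver, and a simple acquisition score that favors waves that are both likely to occur and whose effect on the extreme-event probability the surrogate is still unsure about. A scikit-learn Gaussian process stands in for the machine-learning step.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Toy stand-in for the expensive wave-structure simulation (hours per run in
# practice). A "wave" is described by a single parameter, and the output is a
# scalar structural response. Purely illustrative.
def expensive_simulation(wave_height):
    return wave_height**2 * np.exp(-0.5 * (wave_height - 6.0)**2 / 4.0)

wave_pdf = norm(loc=2.0, scale=1.5)       # assumed occurrence statistics of waves
candidates = np.linspace(0.0, 10.0, 500)  # wave heights we could simulate next
threshold = 4.0                           # response level counted as "extreme"

# Start from a handful of typical waves with known responses.
X = np.array([[1.0], [2.0], [3.0], [4.5]])
y = np.array([expensive_simulation(h) for h in X.ravel()])

kernel = ConstantKernel(1.0) * RBF(length_scale=1.0)

for iteration in range(16):  # the study described above used 16 expensive runs
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
    mean, std = gp.predict(candidates.reshape(-1, 1), return_std=True)

    # Acquisition score (illustrative): favor waves that are likely to occur and
    # for which the surrogate is still unsure whether the response crosses the
    # extreme-event threshold.
    p_exceed = 1.0 - norm.cdf(threshold, loc=mean, scale=std + 1e-9)
    ambiguity = p_exceed * (1.0 - p_exceed)   # largest where the outcome is uncertain
    score = wave_pdf.pdf(candidates) * ambiguity
    next_wave = candidates[np.argmax(score)]

    # Run the expensive simulation only for the selected wave, then refit.
    X = np.vstack([X, [[next_wave]]])
    y = np.append(y, expensive_simulation(next_wave))

# Estimate the probability of an extreme response by weighting the surrogate's
# predictions over all candidate waves by how likely each wave is to occur.
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
mean, std = gp.predict(candidates.reshape(-1, 1), return_std=True)
p_exceed = 1.0 - norm.cdf(threshold, loc=mean, scale=std + 1e-9)
weights = wave_pdf.pdf(candidates)
prob_extreme = np.sum(p_exceed * weights) / np.sum(weights)
print(f"Estimated probability of an extreme response: {prob_extreme:.4f}")
```

In the actual method, the selection criterion is derived so that each new simulation maximally reduces the error in the estimated extreme-event statistics; the score used above is a simplified proxy that captures the same trade-off between how probable a wave is and how much simulating it would improve the estimate.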
Sapsis says the results demonstrate that the team’s method quickly homes in on the waves that are most likely to be involved in an extreme event, and provides designers with more informed, realistic scenarios to simulate in order to test the endurance of not just offshore platforms, but also power grids and flood-prone regions.
“This method paves the way to perform risk assessment, design, and optimization of complex systems based on extreme-event statistics, which is something that has not been considered or done before without severe simplifications,” Sapsis says. “We’re now in a position where we can say, using ideas like this, you can understand and optimize your system, according to risk criteria for extreme events.”
This research was supported, in part, by the Office of Naval Research, Army Research Office, and Air Force Office of Scientific Research, and was initiated through a grant from the American Bureau of Shipping.