The implications for power infrastructure are profound. While modern data centers employ advanced cooling systems and energy-efficient hardware, the sheer scale of AI computation presents unprecedented challenges. Bronis R. de Supinski, Chief Technology Officer (CTO) for Livermore Computing at Lawrence Livermore National Laboratory (LLNL) and ACM Fellow, emphasizes that traditional efficiency metrics like GFlops/Watt fail to capture the complete environmental impact of these systems. In the following interview, de Supinski outlines key strategies to measure and manage AI’s environmental footprint, an ever-more pressing concern for data centers and HPC facilities worldwide.
What are your thoughts on how we can balance the drive for larger AI models against environmental concerns?
Bronis R. de Supinski: Energy efficiency has improved with advances such as clock gating and Dynamic Voltage and Frequency Scaling (DVFS), which helps address ongoing environmental concerns. However, increased energy efficiency also allows us to run more jobs and tackle bigger, more complex problems, which usually increases overall energy use. The energy source is the real key to reducing the environmental impact of computing. Shifting to renewable or low-carbon energy can significantly lower the footprint, no matter the scale of demand.
A more comprehensive approach would include the carbon footprint of energy sources and the lifecycle impact of hardware manufacturing and disposal. This shift in metrics would allow us to address sustainability more holistically while still enabling growth in AI and other computing capabilities.
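To make the contrast with a raw GFlops/Watt figure concrete, a carbon-aware metric can be sketched as follows. This is a minimal illustration, not a standard framework; every number (energy use, grid carbon intensities) is a placeholder assumption.

```python
# A minimal sketch of a carbon-aware efficiency metric, as an
# alternative to raw GFlops/Watt. All figures are illustrative
# assumptions, not measurements from any real system.

def operational_carbon_kg(energy_kwh: float, grid_intensity_kg_per_kwh: float) -> float:
    """CO2 emitted while running the workload."""
    return energy_kwh * grid_intensity_kg_per_kwh

def lifecycle_carbon_kg(operational_kg: float, embodied_kg: float) -> float:
    """Total footprint: operation plus amortized manufacturing/disposal."""
    return operational_kg + embodied_kg

# Hypothetical training run: 500 MWh drawn from two different grids.
energy_kwh = 500_000
coal_heavy = operational_carbon_kg(energy_kwh, 0.8)   # ~0.8 kg CO2/kWh assumed
low_carbon = operational_carbon_kg(energy_kwh, 0.05)  # ~0.05 kg CO2/kWh assumed

print(f"coal-heavy grid: {coal_heavy / 1000:.0f} t CO2")
print(f"low-carbon grid: {low_carbon / 1000:.0f} t CO2")
```

The point the sketch makes is the one de Supinski raises: the same compute, at the same GFlops/Watt, can differ by an order of magnitude in footprint depending solely on the energy source.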
How do you envision the relationship between model size and energy efficiency evolving?
de Supinski: As AI models continue to grow, we are seeing increases in energy use. However, this comes with notable improvements in energy efficiency and computational speed. These advancements mean that while systems can now solve problems faster, they are also able to tackle larger, more complex problems, which drives up overall energy consumption. To wrap it in a bow: more computing capability means more problems tackled, which results in more energy use.
When considering the environmental impact of this trend, it’s important to note that energy efficiency alone is not the primary factor. The source of the energy we use matters more. Transitioning to renewable and low-carbon energy sources is crucial if we want to mitigate the environmental effects of these growing computational demands.
As model size continues to grow, we must consider the practical limits, theoretical implications, and environmental implications of this scaling. Balancing innovation with sustainability will be key as we move forward.
Following the breakthrough fusion ignition achievement at Lawrence Livermore in 2022, how do you see the computational needs for fusion research affecting overall energy consumption in scientific computing?
About Bronis R. de Supinski
Bronis R. de Supinski is chief technology officer (CTO) for Livermore Computing (LC) at Lawrence Livermore National Laboratory (LLNL). In this role, he formulates LLNL’s large-scale computing strategy and oversees its implementation. He frequently interacts with supercomputing leaders and oversees many collaborations with industry and academia.
Previously, Bronis led several research projects in LLNL’s Center for Applied Scientific Computing. He earned his Ph.D. in Computer Science from the University of Virginia in 1998 and joined LLNL in July of that year.
In addition to his work with LLNL, Bronis is a Professor of Exascale Computing at Queen’s University Belfast. Throughout his career, he has won several awards, including the prestigious Gordon Bell Prize in 2005 and 2006, as well as two R&D 100 Awards. He is a Fellow of the ACM and the IEEE, and he has held leadership roles in most major ACM HPC conferences, including serving as the SC21 General Chair and presenting at SC’24.
de Supinski: Fusion energy has shown a promising path toward a sustainable and powerful energy source for the future, and the breakthrough ignition achievement at the National Ignition Facility (NIF) in 2022 was a critical milestone. Making progress in controlled fusion is an incredible scientific achievement, but it also highlights how important computing power is in making it happen.
The breakthrough at NIF relied on advanced computer modeling to solve challenges like capsule design and creating the exact conditions needed for ignition. Developing commercial fusion energy will take even more computing power and likely 20 years or more to become a reality. While this will increase energy use in research, the ultimate goal — to make fusion a clean, net-positive energy source — makes the effort well worth it. As computational power continues to advance, ensuring that it is powered by renewable energy will be crucial to achieving fusion energy’s promise sustainably.
What concrete steps should organizations take in assessing an AI deployment’s benefits against its environmental costs?
de Supinski: Organizations must carefully weigh the benefits of using AI against its environmental impact. This means looking at how energy-efficient their systems are and using tools to save energy when adjusting to workload demands. It’s also important to use clean energy sources, like wind or solar, to power these systems. It’s not just about how much energy the AI uses but also about the impact of building and disposing of the hardware. Smarter AI systems can help reduce waste by working more efficiently, and organizations should measure both the immediate energy use and the long-term environmental effects. The goal is to ensure the positive outcomes of using AI outweigh its environmental costs.
Could you share examples of successful AI-driven energy optimizations you’ve observed at Lawrence Livermore or elsewhere?
de Supinski: Energy optimizations have primarily been driven by hardware advances. Historically, large-scale systems have been used for modeling and simulation. The Blue Gene/L system was one of the first systems for which energy efficiency was a primary design goal, and it was the most energy-efficient system in the world when it was built. Its advances included massive parallelism and relatively low chip frequencies. Today, systems like El Capitan are even more powerful and efficient; the GPUs at the heart of these systems reflect those lessons but also demonstrate that continued hardware and software improvements can yield significant benefits.
AI can help guide those improvements. For example, a frequent technique we use is DVFS, which adjusts a processor’s speed to match the workload, saving energy without sacrificing performance. The goal is to work as efficiently as possible while avoiding wasted energy. AI models can help make faster, smarter decisions about resource use, and as technology advances, we can expect even more progress in making computing systems sustainable.
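The DVFS idea described above can be sketched as a simple frequency-selection policy: pick the lowest clock that still meets current demand, much as an "ondemand"-style governor does. The frequencies, the 80% headroom threshold, and the utilization model are all illustrative assumptions, not parameters of any specific processor.

```python
# A simplified sketch of a DVFS-style policy: choose the lowest clock
# frequency that keeps predicted utilization below ~80%. Frequencies
# and thresholds are illustrative assumptions only.

FREQS_MHZ = [800, 1600, 2400, 3200]  # available P-states, low to high

def pick_frequency(utilization: float, current_mhz: int) -> int:
    """Return the lowest frequency that can absorb the current demand."""
    # Demand in "MHz of work" implied by utilization at the current clock.
    demand = utilization * current_mhz
    for f in FREQS_MHZ:
        if demand <= 0.8 * f:
            return f
    return FREQS_MHZ[-1]  # saturated: run at the top clock

# A lightly loaded core at the top clock drops to a lower frequency,
# saving energy; a busy core stays high to preserve performance.
print(pick_frequency(0.2, 3200))  # demand 640 -> 800 MHz suffices
print(pick_frequency(0.9, 3200))  # demand 2880 -> needs 3200 MHz
```

A real governor would add hysteresis and measurement smoothing, and an AI-driven policy could replace the fixed threshold with a learned prediction of upcoming demand, which is the kind of faster, smarter resource decision de Supinski describes.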
As we approach 2025, what key indicators should we monitor to assess progress in environmentally conscious computing?
de Supinski: As we move toward 2025, a few key indicators can help measure progress in environmentally conscious computing. One is the adoption of energy-efficient technologies, like systems that significantly adjust power use based on workload demands. Another is the shift toward renewable energy sources powering large-scale computing. Additionally, tracking the development of smaller, more specialized AI models that perform as well as larger ones is crucial. These advances can reduce energy consumption while maintaining effectiveness. Together, these trends highlight the importance of smarter hardware, software, and energy choices in shaping a more sustainable future for computing.
Could you elaborate on specific metrics or frameworks you recommend for measuring lifecycle energy consumption of AI systems?
de Supinski: A comprehensive framework should include the carbon emissions associated with hardware manufacturing, transport, operation, and disposal. The energy source used during operation is equally critical, as renewable energy can dramatically lower the system’s lifecycle carbon footprint.
Hardware advances, such as energy-efficient processors, must be complemented by software optimizations like intelligent resource management. This dual focus ensures that computing systems are optimized for both performance and sustainability. By adopting such holistic metrics, the industry can better align technological advancements with environmental responsibility.
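The lifecycle accounting described above, summing emissions from manufacturing, transport, operation, and disposal, can be sketched in a few lines. Every figure here is a placeholder assumption for a hypothetical accelerator node, not data from any real system.

```python
# A hedged sketch of lifecycle carbon accounting: embodied emissions
# (manufacturing, transport, disposal) plus operational emissions over
# the hardware's service life. All numbers are placeholder assumptions.

from dataclasses import dataclass

@dataclass
class LifecycleFootprint:
    manufacturing_kg: float      # embodied CO2 from fabrication
    transport_kg: float          # shipping to the facility
    disposal_kg: float           # end-of-life processing
    energy_kwh_per_year: float   # operational energy draw
    grid_kg_per_kwh: float       # carbon intensity of the power source
    lifetime_years: float

    def total_kg(self) -> float:
        operational = (self.energy_kwh_per_year
                       * self.grid_kg_per_kwh
                       * self.lifetime_years)
        return (self.manufacturing_kg + self.transport_kg
                + self.disposal_kg + operational)

# Hypothetical node: ~1 kW average draw (8,760 kWh/yr) for five years.
node = LifecycleFootprint(
    manufacturing_kg=3000, transport_kg=100, disposal_kg=200,
    energy_kwh_per_year=8760, grid_kg_per_kwh=0.4, lifetime_years=5,
)
print(f"{node.total_kg() / 1000:.1f} t CO2 over its lifetime")
```

With these assumed figures the operational term dominates, which is why the grid's carbon intensity matters so much; on a very low-carbon grid, the embodied emissions would instead become the largest share.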