
OpenAI CEO Sam Altman warned this week that the U.S. is misjudging the speed and breadth of China’s AI push, adding that tightening chip export controls alone won’t slow Beijing. He spoke with reporters in San Francisco, first at an on-the-record dinner on Aug. 14 and then in remarks reported Aug. 18–19. “I’m worried about China,” Altman said in one of the appearances, arguing that “my instinct is that [export controls] don’t work.”
Altman also linked the rise of Chinese open-source (open-weight) models to OpenAI’s own decision to release weights for its GPT-oss models this month. “If we didn’t do it, the world was gonna be mostly built on Chinese open source models,” he told reporters. Altman pointed to systems like DeepSeek and Kimi. That competitive backdrop aligns with outside reporting that framed OpenAI’s open-weight move as a direct challenge to China’s lead in open models.
Power, not just GPUs, as the deciding edge
The timing of Altman’s comments dovetails with a Fortune report describing how China’s electricity abundance, after decades of heavy investment in generation, transmission and renewables, lets it scale AI infrastructure without the bottlenecks emerging in the U.S. AI specialists who toured China told Fortune that energy availability is treated as a given; the country adds more electricity demand each year than Germany’s entire annual consumption and keeps reserve margins around 80%–100%, with idle coal plants ready as backstops.
The U.S., by contrast, is running into hard infrastructure limits just as AI-related load spikes. A Department of Energy–backed Lawrence Berkeley National Laboratory (LBNL) study projects U.S. data-center power use could double or even triple by 2028, taking data centers’ share of total U.S. electricity from about 4.4% in 2023 to 6.7%–12% by 2028. That demand growth is already straining interconnection queues and transmission, shifting where and how fast new AI clusters can be built.
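The “double or even triple” framing is consistent with the share projections. A quick back-of-envelope check in Python, using an assumed ~4,050 TWh of total 2023 U.S. consumption and assumed modest overall demand growth by 2028 (both are illustrative assumptions, not figures from the LBNL study):

```python
# Back-of-envelope check of the LBNL share projections.
# The totals below are ASSUMED round numbers for illustration only.
US_TOTAL_TWH_2023 = 4050   # assumption: approximate total U.S. consumption, 2023
US_TOTAL_TWH_2028 = 4300   # assumption: modest overall demand growth by 2028

dc_share_2023 = 0.044                                # LBNL: data centers' 2023 share
dc_share_2028_low, dc_share_2028_high = 0.067, 0.12  # LBNL: projected 2028 range

dc_twh_2023 = dc_share_2023 * US_TOTAL_TWH_2023
dc_twh_2028_low = dc_share_2028_low * US_TOTAL_TWH_2028
dc_twh_2028_high = dc_share_2028_high * US_TOTAL_TWH_2028

# Growth multiples implied by the share range
low_multiple = dc_twh_2028_low / dc_twh_2023    # roughly 1.6x
high_multiple = dc_twh_2028_high / dc_twh_2023  # roughly 2.9x
print(f"2023: {dc_twh_2023:.0f} TWh; "
      f"2028: {dc_twh_2028_low:.0f}-{dc_twh_2028_high:.0f} TWh "
      f"({low_multiple:.1f}x-{high_multiple:.1f}x)")
```

Under those assumptions, data-center load grows from roughly 180 TWh to somewhere between about 290 and 520 TWh, i.e. on the order of 1.6x to 2.9x, matching the study’s “double or triple” characterization.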
Those bottlenecks are structural, not cyclical. The latest LBNL “Queued Up” analysis (April 2024 edition, analyzing data through 2023) shows the typical project built in 2023 took nearly five years to move from interconnection request to commercial operation, up from roughly three years in 2015. Completion rates remain low and backlog pressure is nationwide, according to Berkeley Lab.
Workarounds: When the grid can’t keep up
Some tech firms are responding by securing dedicated, firm power, often nuclear. In June, Reuters reported on a 20-year deal between Microsoft and Constellation that is helping restart Three Mile Island Unit 1 as early as 2027 to serve data-center load. Two months later, Google and Kairos Power announced plans for a 50-MW advanced reactor with TVA in Oak Ridge, Tennessee, targeting 2030. And in a deal announced June 11, 2025, Amazon Web Services separately contracted to draw up to 1,920 MW from Talen Energy’s Susquehanna nuclear plant for Pennsylvania data centers.
Altman didn’t sugarcoat the financial climate. Asked whether investors are collectively overhyping AI, he answered, “Yes,” likening today to the late-’90s dot-com era “where smart people get overexcited about a kernel of truth.” At the same time, he called AI “the most important thing to happen in a very long time,” and said OpenAI expects to spend trillions of dollars on data-center construction “in the not very distant future.” That ambition intersects directly with U.S. energy constraints, and with China’s head start on available power.
The price tag keeps climbing
Independent analyses suggest the infrastructure bill will be staggering regardless of who leads. McKinsey estimates the world will need about $6.7 trillion in data-center investment by 2030, roughly 70% of it tied to AI workloads (a figure often summarized as “almost $7 trillion”). If U.S. permitting and transmission continue to lag, those dollars will flow to the geographies that can connect new load fastest, one reason Altman’s warnings about underestimating China resonated across the R&D community this week.
China’s edge lies in its abundant, low-cost power, which lets training and inference expand with minimal friction despite chip curbs, per experts cited in recent reports. That reality amps up the urgency for U.S. policymakers to modernize the grid fast.
Bottom line: Chips remain key, but in 2025, the tightest bottleneck for AI may be electrons. Altman’s warnings boil down to this—the race goes to whoever has power ready and regulations that don’t drag.



