When Microsoft's CEO said, "Insufficient power may lead to chips piling up," neither he nor Altman knew how much electricity AI actually requires

Wallstreetcn
2025.11.04 03:29

Artificial intelligence giants are facing a new bottleneck: electricity. Microsoft CEO Satya Nadella revealed that chips the company has already purchased are sitting idle for lack of power and data-center capacity. Altman pointed out that tech companies face a dilemma: if they lock in long-term power contracts now, breakthroughs in new energy technologies could leave them with losses; if they underinvest, they may be unable to meet explosive growth in AI demand.

The focus of the artificial intelligence competition is shifting from computing power to electricity. Tech industry leaders acknowledge that they are grappling with a fundamental uncertainty: how much energy will future AI actually consume?

Microsoft CEO Satya Nadella recently revealed during the "BG2" podcast that the biggest issue currently limiting the company's growth is no longer chip shortages. Nadella stated, "The biggest problem we face now is not an excess of computing power, but electricity... and whether we can build data centers close to power sources quickly enough."

Nadella candidly pointed out that this disconnect has left Microsoft with a chip backlog. "You might have a bunch of chips sitting idle in inventory because I can't plug them into power. In fact, that's the problem I'm facing today." He added that the issue is not chip supply but the lack of "warm shells," data centers ready to be occupied and powered on at any time. This makes clear that the pace of infrastructure construction in the physical world has lagged far behind the expansion of computing power in the digital world.

OpenAI CEO Sam Altman, who participated in the podcast alongside Nadella, also emphasized the strategic dilemma brought about by this uncertainty. He believes the entire industry is in a massive energy gamble, with no one knowing the outcome.

Bottleneck Shift: From Chips to Electricity

For a long time, the market has generally believed that acquiring advanced graphics processing units (GPUs) is the biggest obstacle to deploying AI services. However, Nadella's remarks confirm that the bottleneck has shifted. When the chips that tech companies spend huge sums on cannot be powered on, the advantage of computing power becomes irrelevant.

This phenomenon reflects the challenges that software and chip companies, accustomed to rapid iteration, face when they confront asset-heavy, long-cycle industries such as energy and real estate. In the United States, data-center electricity demand has surged over the past five years, breaking a decade-long plateau, with growth outpacing the generation-capacity planning of utility companies.

This has forced data center developers to seek "behind-the-meter" power solutions, bypassing the public grid to obtain electricity directly from generation facilities.

Demand Fog: How Much Energy Does AI Need?

"How much electricity is enough? No one knows, not even Sam Altman or Satya Nadella," TechCrunch pointed out in a report on November 3. This uncertainty stems from the rapid evolution of AI technology itself.

Altman sketched a "very scary exponential" growth scenario during the podcast. He hypothesized that if the cost per unit of intelligence kept falling 40-fold per year, the resulting demand growth, seen from an infrastructure perspective, would be astonishing.

He firmly believes the "Jevons paradox" will play out in AI: improvements in computing efficiency and falling costs will stimulate usage to grow more than a hundredfold, as applications that are uneconomical at today's costs become viable.
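The arithmetic behind that bet can be sketched with the hypothetical figures quoted above. Both numbers (a 40-fold annual cost decline and a more-than-hundredfold usage increase) are illustrative claims from the podcast, not measured data:

```python
# Illustrative Jevons-paradox arithmetic using the podcast's hypothetical figures.
cost_decline = 40    # unit cost of "intelligence" falls 40x per year (Altman's hypothetical)
usage_growth = 100   # usage grows 100x as cheaper AI unlocks new applications (illustrative)

# Even with a 40x cost drop, total infrastructure demand still grows
# if usage grows faster than cost falls:
net_spend_growth = usage_growth / cost_decline
print(net_spend_growth)  # 2.5 -> total demand on infrastructure still rises 2.5x
```

On these assumed numbers, efficiency gains do not relieve the power bottleneck: total demand still multiplies, which is the essence of the paradox.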

Energy Gamble: The Dilemma of Betting on the Future

It is this enormous uncertainty that puts industry leaders like Altman in a difficult position regarding energy strategies. He describes a dilemma: "If a very cheap form of energy is quickly adopted on a large scale, then many who have signed existing (expensive) power contracts will suffer heavy losses."

On the other hand, if bold investments are not made, there is a risk of missing out on the explosive demand for AI. Altman acknowledges that if the efficiency gains from AI exceed expectations, or if demand growth falls short of expectations, some companies may bear the heavy burden of idle power plants.

To hedge against risks and explore the future, Altman himself has invested in several energy startups, including the nuclear fission company Oklo, the nuclear fusion company Helion, and a solar thermal storage company Exowatt.

Response Strategy: Finding a Way Out Between Tradition and Innovation

In the face of challenges, technology companies are actively seeking solutions. The construction cycle for traditional natural gas power plants can take years, which does not match the speed of demand in the AI industry. Therefore, deploying faster, cheaper, and zero-emission solar energy has become a popular choice.

Solar photovoltaic technology shares much with the semiconductor industry: both are built on silicon, and both are produced as modular components that can be quickly assembled into arrays to scale up power. This modularity and speed of deployment bring its construction pace closer to that of data centers. Still, both data centers and solar projects take time to build, and market demand is shifting far faster than either.

This leaves technology companies in a constant race against time across the interconnected fields of computing power, data centers, and electricity, and under continual pressure to make the right strategic bets.