AI's Hunger for Power Can Be Tamed

The artificial intelligence hype that made Nvidia Corp. the world’s biggest company has come with a price for the world’s climate. Data centers housing its powerful chips are gorging on power and belching carbon dioxide, and sobering figures now reveal the extent of the problem. Data centers will consume 8% of US power by 2030, up from 3% in 2022, according to a recent report from Goldman Sachs Group Inc., as their energy demand grows by 160%.

AI is currently doing more to worsen the climate emergency than to solve it, despite some AI firms’ claims to the contrary. So great are the energy needs that utilities are extending the lives of coal plants, while Microsoft Corp. is turning to gas and nuclear facilities to keep its servers humming.

Add this all to the growing discontent about generative AI tools. To not only stem the tide but also uphold their goals of building AI “for humanity,” tech firms like OpenAI, Microsoft and Alphabet Inc.’s Google must grow their teams addressing the power issue. It would certainly be possible. A few signs of progress suggest the trick may be to redesign their algorithms.

Generative AI models like ChatGPT and Anthropic’s Claude are impressive, but their neural network architectures demand vast amounts of energy, and their indecipherable “black box” decision-making processes make them difficult to optimize. The current state of AI is like trying to power a small car with a huge gas-guzzling engine: It gets the job done, but at enormous cost.