The artificial intelligence hype that made Nvidia Corp. the world’s biggest company has come with a price for the world’s climate. Data centers housing its powerful chips are gorging on power and belching carbon dioxide, and sobering figures now reveal the extent of the problem. Data centers will use 8% of US power by 2030, up from 3% in 2022, as their energy demand grows by 160%, according to a recent report from Goldman Sachs Group Inc.
AI is currently doing more to worsen the climate emergency than to solve it, despite what some AI firms have touted.1 So great are the energy needs that utilities are extending their plans for coal plants, while Microsoft Corp. is building gas and nuclear facilities to keep its servers humming.
Add all this to the growing discontent about generative AI tools. To not only stem the tide but also uphold their stated goal of building AI “for humanity,” tech firms like OpenAI, Microsoft and Alphabet Inc.’s Google must grow the teams addressing the power issue. That is certainly possible: a few signs of progress suggest the trick may be to redesign their algorithms.
Generative AI models like ChatGPT and Anthropic’s Claude are impressive, but their neural network architectures demand vast amounts of energy, and their indecipherable “black box” decision-making processes make them difficult to optimize. The current state of AI is like trying to power a small car with a huge gas-guzzling engine: It gets the job done, but at enormous cost.2
The good news is that these “engines” could get smaller with greater investment. Researchers at Microsoft, for instance, have developed a so-called 1-bit architecture that can make large language models about 10 times more energy efficient than the current leading systems. This approach simplifies the models’ calculations by reducing their internal values to -1, 0 or 1, slashing power consumption without sacrificing too much performance. The resulting tech isn’t the most capable, but it’s a good example of a “contrarian” approach that can immediately reduce AI’s cost and environmental impact, says Steven Marsh, founder of Cambridge, UK-based startup Zetlin Ltd., which is working on building more efficient systems.
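For a flavor of how this works, here’s a minimal sketch of the “absmean” ternary quantization scheme described in Microsoft’s BitNet b1.58 paper, the most prominent of that line of work. The NumPy implementation and toy matrix are illustrative only; real deployments rely on custom low-level kernels.

```python
import numpy as np

def quantize_ternary(weights, eps=1e-5):
    """Map full-precision weights to {-1, 0, 1} plus one scale factor,
    a sketch of the 'absmean' scheme from Microsoft's BitNet b1.58 paper."""
    scale = np.mean(np.abs(weights)) + eps               # one scale per tensor
    ternary = np.clip(np.round(weights / scale), -1, 1)  # round, then clamp
    return ternary.astype(np.int8), scale

def dequantize(ternary, scale):
    """Recover an approximation of the original weights."""
    return ternary.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(scale=0.02, size=(4, 8)).astype(np.float32)
q, s = quantize_ternary(w)
print(q)                                   # entries are only -1, 0 or 1
print(np.abs(w - dequantize(q, s)).max())  # the quantization error
```

Because every stored weight is -1, 0 or 1, the expensive multiplications inside a matrix multiply collapse into additions, subtractions and skips, which is where most of the energy saving comes from.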
Marsh says he’s making progress. His team recently trained a neural network-based AI model on an Nvidia graphics processing unit (GPU), and over the five-day run the system got so hot that they had to bring fans into the room. When they ran the same model on their proprietary, non-neural network technology, it used just 60% of the power. The current approach, Marsh says, is “like putting a rocket engine on a bicycle.”
Nvidia has also taken promising steps toward addressing the energy problem. A couple of years ago, it developed a new format that lets its chips process AI calculations with smaller numbers, making them faster and less power-hungry. “Just that little tweak on the silicon saved a lot of energy,” Marsh says. If companies designing AI systems took better advantage of that tweak, they could eventually save far more.
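The article doesn’t name the format, but it most likely refers to the 8-bit floating point (FP8) support Nvidia introduced with its Hopper-generation chips in 2022. The back-of-the-envelope sketch below shows why shrinking the number format matters: moving bytes dominates a chip’s energy budget, and each halving of the format halves the traffic. The 70-billion-parameter model is a hypothetical example.

```python
# Back-of-the-envelope arithmetic, assuming the unnamed format is FP8
# (8-bit floating point, introduced with Nvidia's Hopper GPUs in 2022).
PARAMS = 70e9  # a hypothetical 70-billion-parameter model

for fmt, bytes_per_value in [("FP32", 4), ("FP16", 2), ("FP8", 1)]:
    gigabytes = PARAMS * bytes_per_value / 1e9
    print(f"{fmt}: {gigabytes:,.0f} GB of weights to store and move")

# Prints 280 GB, 140 GB, 70 GB -- smaller numbers mean less data
# shuttled between memory and compute units on every pass.
```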
It doesn’t help that AI companies are in an arms race. OpenAI and Anthropic have raised $11.3 billion and $8.4 billion, respectively, according to data provider PitchBook. Much of that money isn’t going to recruitment (they each have workforces of just a few hundred people). Instead, it’s being poured into running servers that can train and run their models, even as their investment leads to diminishing returns. (There is evidence that the latest text- and vision-reading systems are showing smaller advancements in areas like accuracy and capability.)
Those companies, along with Google, Microsoft and Amazon.com Inc., should devote more of that money to refashioning their algorithms to save energy and cost. It’s been done before: data center operators managed to keep power demand flat between 2015 and 2019, even as workloads tripled, by finding ways to run their facilities more efficiently, according to Goldman Sachs.
OpenAI Chief Executive Officer Sam Altman has talked up nuclear fusion as an answer to the problem, having personally invested $375 million in an enterprise called Helion Energy. But he risks creating hype around an energy technology that won’t be commercialized for several decades.
Rather than outsource responsibility to a futuristic energy source or superintelligent AI that doesn’t exist yet, tech firms should put greater focus on making their models more energy efficient now. After all, breaking away from established and inefficient systems was how this revolution was kicked off in the first place.
1 Google DeepMind’s tagline used to be “Solve intelligence, then solve everything else.” Its co-founder Demis Hassabis would often point to climate change as one of the key issues that artificial general intelligence (AGI) could solve.
2 Many developers building AI systems today use off-the-shelf, open-source systems that are hard-coded in how they interact with an Nvidia chip.
Bloomberg News provided this article. For more articles like this please visit bloomberg.com.
Read more articles by Parmy Olson