If you ask ChatGPT to explain the impact AI is having on the electric grid, it cheerfully (and without a hint of irony) explains all the ways AI consumes vast amounts of power. Training models? Yup, that takes a few electrons. Answering questions, including the question of how much electricity it takes to answer a question? Add even more electrons to the tab.
It’s been well publicized that tech companies are investing huge sums of money in hardware to support their burgeoning AI efforts. They are buying every Graphics Processing Unit (GPU) they can get their hands on, as GPUs excel at the parallel computations modern AI requires. Those GPUs are housed in massive data centers, where they devour electricity, generating endless heat in the process. Dealing with that heat requires robust air conditioning, which only increases the electricity bill further.
And, of course, the human engineers programming these systems probably consume a lot of coffee to stay energized, and that takes some energy too.
In February 2024, The Verge published a story pondering the electricity consumption of modern AI systems such as Large Language Models (LLMs). Although the closed nature of many AI systems makes it difficult to estimate their exact energy use (the companies that make them aren’t volunteering any numbers), author James Vincent offered some estimates on both the training and query-answering sides.
Training, in particular, is extremely energy intensive, consuming much more electricity than traditional data center activities. Training a large language model like GPT-3, for example, is estimated to use just under 1,300 megawatt-hours (MWh) of electricity, about as much as 130 US homes consume in a year. To put that in context, streaming an hour of Netflix requires around 0.8 kWh (0.0008 MWh) of electricity. That means you’d have to watch 1,625,000 hours of Netflix to consume the same amount of energy it takes to train GPT-3.
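As a rough back-of-the-envelope check, using the estimated 0.8 kWh-per-hour streaming figure above:

\[
\frac{1{,}300\ \text{MWh}}{0.0008\ \text{MWh per hour}} = 1{,}625{,}000\ \text{hours} \approx 185\ \text{years of nonstop streaming}
\]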
Once a model has been trained, it takes much less power to run a query against it, although at scale, that quickly adds up.
Vincent references a study performed by researchers Alexandra Sasha Luccioni, Yacine Jernite, and Emma Strubell that examined the possible energy requirements of common AI tasks such as answering questions or generating images. Vincent writes:
Luccioni and her colleagues ran tests on 88 different models spanning a range of use cases, from answering questions to identifying objects and generating images. In each case, they ran the task 1,000 times and estimated the energy cost. Most tasks they tested use a small amount of energy, like 0.002 kWh to classify written samples and 0.047 kWh to generate text. If we use our hour of Netflix streaming as a comparison, these are equivalent to the energy consumed watching nine seconds or 3.5 minutes, respectively. (Remember: that’s the cost to perform each task 1,000 times.) The figures were notably larger for image-generation models, which used on average 2.907 kWh per 1,000 inferences. As the paper notes, the average smartphone uses 0.012 kWh to charge — so generating one image using AI can use almost as much energy as charging your smartphone.
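Dividing those averages out per individual task (a rough approximation, since the figures vary considerably from model to model):

\[
\frac{0.047\ \text{kWh}}{1{,}000\ \text{tasks}} \approx 0.05\ \text{Wh per text generation},
\qquad
\frac{2.907\ \text{kWh}}{1{,}000\ \text{tasks}} \approx 2.9\ \text{Wh per image}
\]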
Vincent cautions that there can still be significant variance in these estimates, depending on how the AI systems are actually implemented.
But the trendlines are clear: the amount of electricity consumed by AI is going to grow significantly in the coming years. According to the Boston Consulting Group, which provides AI-related consulting services, data centers, driven largely by AI workloads, could consume up to 7.5% of total US electricity output by 2030, roughly tripling their energy usage from just a couple of years ago.
That’s going to put a great deal of stress on electricity generation, the electric grid, and the environment. The editors of Bloomberg Businessweek recently acknowledged the problem in an opinion piece:
It’s easy to envision how things could go wrong. In the US, power-hungry AI applications are already adding to strains on electricity grids and pushing utilities to burn more fossil fuels. In Ireland, a global computing hub, data centers are expected to consume nearly a third of all electricity by 2032. Cue a vignette of people unwittingly boiling the oceans in pursuit of the perfect dog portrait.
But their prognosis isn’t all doom and gloom; they see tech companies as possibly being in the best position to address the problem:
There’s also a more positive scenario. The users and owners of these data centers — including Alphabet Inc., Amazon.com Inc., Meta Platforms Inc. and Microsoft Corp. — are among the world’s largest companies, with ample cash, long strategic horizons and public commitments to the environment. Who better to drive some of the tens of trillions of dollars in investment required to build clean generation, enhance power grids and achieve net-zero carbon emissions?
There could also be breakthroughs in the amount of processing computers can perform per watt of power; indeed, modern microprocessors have already made significant gains in energy efficiency. And more efficient algorithms could yield the same results with lower power demands.
But at least in the near term, it looks likely that AI-related processing is going to be placing increasing amounts of stress on the electric grid.