The artificial intelligence (AI) boom of 2023 has raised concerns about the electricity consumption of AI and data centres.
AI has undergone a period of rapid expansion since OpenAI’s ChatGPT burst onto the scene nearly a year ago, spurring the likes of Microsoft and Alphabet to introduce their own chatbots in 2023.
But a recent study has warned that Google’s AI alone could consume as much as 29.3 terawatt-hours (TWh) per year, roughly equivalent to the annual electricity consumption of Ireland.
An analysis published in Joule in October says this is a worst-case scenario that assumes full-scale AI adoption using current hardware and software, which is unlikely to happen quickly.
Estimating the energy consumption of AI
The analysis is authored by Alex de Vries, founder of Digiconomist, a research company dedicated to exposing the unintended consequences of digital trends. His article explores initial research on AI electricity consumption and assesses the potential implications of widespread AI adoption on global data centre electricity use.
Currently, the International Energy Agency estimates that data centre electricity consumption represents 1 to 1.5 percent of worldwide electricity consumption. But, according to de Vries’ analysis, enthusiasm for AI has put the AI server supply chain on track to make a more significant contribution to worldwide data centre electricity consumption in the coming years. He discusses both pessimistic and optimistic scenarios.
Careful steps forward
De Vries is not the first to draw public focus to the environmental sustainability of the frenzied development of AI. Among others, AI researcher Kate Saenko similarly explored the energy costs of building generative AI models for The Conversation in May.
Both Saenko and de Vries agree that the future of AI-related electricity consumption is hard to predict.
“While a single large AI model is not going to ruin the environment, if a thousand companies develop slightly different AI bots for different purposes, each used by millions of customers, the energy use could become an issue,” says Saenko.
“More research is needed to make generative AI more efficient. The good news is that AI can run on renewable energy. By bringing the computation to where green energy is more abundant, or scheduling computation for times of day when renewable energy is more available, emissions can be reduced by a factor of 30 to 40, compared to using a grid dominated by fossil fuels.”
De Vries cautions against both overly optimistic and overly pessimistic expectations.
“Integrating AI into applications such as Google Search can significantly boost the electricity consumption of these applications,” he says.
“However, various resource factors are likely to restrain the growth of global AI-related electricity consumption in the near term. Simultaneously, it is probably too optimistic to expect that improvements in hardware and software efficiencies will fully offset any long-term changes in AI-related electricity consumption. These advancements can trigger a rebound effect whereby increasing efficiency leads to increased demand for AI, escalating rather than reducing total resource use.”
De Vries concludes with another cautionary note for those at the helm of this technology.
“It would be advisable for developers not only to focus on optimising AI, but also to critically consider the necessity of using AI in the first place, as it is unlikely that all applications will benefit from AI or that the benefits will always outweigh the costs.”