AI systems are projected to account for nearly half of data centre power consumption by the end of this year, according to new research by Alex de Vries-Gao, the founder of Digiconomist. This analysis, published in the academic journal Joule, highlights the escalating energy demands of AI technologies, which could significantly impact data centre operations globally.

De Vries-Gao’s research draws attention to the power consumed by chips from companies such as Nvidia and Advanced Micro Devices (AMD), which are essential for training and running AI models. The study also considers chips from other manufacturers, such as Broadcom.

This comes after the International Energy Agency (IEA) predicted that AI could require energy levels comparable to Japan’s current consumption by the decade’s end. Recently, Amazon Web Services (AWS) CEO Matt Garman also stated that the UK needs more nuclear energy to support the data centres crucial for AI.

Last year, data centres, excluding those used for cryptocurrency mining, consumed approximately 415 terawatt hours (TWh) of electricity, according to the IEA. De Vries-Gao estimates that AI systems already account for about 20% of this consumption. By the end of 2025, AI’s share could reach up to 49%, equivalent to 23 GW of continuous power demand, nearly double the annual electricity consumption of the Netherlands.
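The article’s figures can be cross-checked with simple arithmetic. The sketch below uses only the numbers reported above, plus an assumed figure of roughly 110 TWh for the Netherlands’ annual electricity consumption, and shows that 49% of 415 TWh lines up with 23 GW of continuous demand sustained over a year.

```python
# Back-of-the-envelope check of the reported figures.
# Inputs are the article's own numbers; the Netherlands figure (~110 TWh/year)
# is an assumption for illustration.
data_centre_twh_2024 = 415   # IEA estimate, excluding crypto mining
ai_share_now = 0.20          # de Vries-Gao's current estimate
ai_share_2025 = 0.49         # projected upper bound for end of 2025

ai_twh_now = data_centre_twh_2024 * ai_share_now      # ~83 TWh today
ai_twh_2025 = data_centre_twh_2024 * ai_share_2025    # ~203 TWh projected

# 23 GW of continuous demand over a year (8,760 hours), converted to TWh:
gw_continuous = 23
ai_twh_from_gw = gw_continuous * 8760 / 1000          # ~201 TWh, consistent

netherlands_twh = 110        # assumed annual consumption
ratio = ai_twh_2025 / netherlands_twh                 # ~1.8, "nearly double"

print(round(ai_twh_now), round(ai_twh_2025), round(ai_twh_from_gw), round(ratio, 1))
```

The two routes to the projection, 49% of 415 TWh and 23 GW sustained for a year, agree to within about 1%, which supports the internal consistency of the study’s estimates.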

Critical factors in AI’s energy footprint

Several factors influence these calculations, including data centre energy efficiency and the electricity needed for cooling systems managing AI workloads. As data centres serve as the backbone of AI technology, their high energy requirements pose sustainability challenges for AI development and usage.

Several factors could nonetheless temper the rising demand for AI hardware. De Vries-Gao notes that waning interest in applications such as ChatGPT, along with geopolitical issues, could slow demand. Conversely, efficiency improvements could make AI cheaper to run and thereby spur further adoption, increasing overall consumption. The trend toward ‘sovereign AI’, where countries develop their own AI systems, could also boost hardware demand.

Tech giants Microsoft and Google have acknowledged that their AI initiatives risk compromising their environmental goals. De Vries-Gao emphasises the lack of transparency regarding AI power demands, describing the sector as an ‘opaque industry’. Although the EU AI Act mandates disclosure of energy use for AI model training, it doesn’t extend to everyday operations.

Professor Adam Sobey, mission director for sustainability at the UK’s Alan Turing Institute, told The Guardian that greater transparency in AI energy consumption is needed. He suggested that AI’s potential to improve efficiency in carbon-intensive industries such as transport and energy could offset its own energy use. “I suspect that we don’t need many very good use cases [of AI] to offset the energy being used on the front end,” Sobey told the publication.

Read more: AI to propel global data centre electricity demand by 2030