The global use of artificial intelligence is growing at lightning speed. Yet the energy consumption associated with it remains largely invisible. Research by data scientist Alex de Vries-Gao shows that AI is on its way to becoming one of the biggest energy consumers in the digital world. There is a lack of oversight and transparency.
AI applications such as chatbots, search engines and translation tools are increasingly being used by businesses and consumers. Behind these systems, thousands of powerful chips provide enormous amounts of computing power: first to train AI models, and then to run them again and again. Especially the latter, the use of a trained model, known as inference, happens daily on a gigantic scale. According to De Vries-Gao, who works at the Institute for Environmental Studies (IVM), the total energy consumption of AI is therefore greatly underestimated.
In his research, which appeared in the scientific journal Joule, De Vries-Gao shows that there is a major lack of transparency. Large technology companies such as Google, Microsoft and OpenAI provide hardly any insight into the energy consumption of their AI systems. Google last reported, in 2022, that AI accounted for 10 to 15 percent of its total energy consumption; no new figures have been shared since. The European AI Act requires companies to report energy use, but only for the training phase of AI models. Actual usage, which accounts for most of the consumption, falls outside that obligation.
Lacking hard numbers, De Vries-Gao analyzed the power consumed by the specialized AI accelerator modules that run these systems. Major chip manufacturers such as NVIDIA and AMD shipped more than five million of these AI accelerators in 2023 and 2024. Based on the global production capacity of these chips and the specifications of the modules built around them, he estimates the power consumption of the modules alone at 3 to 5.2 gigawatts. For full AI systems built on these modules, including the data-center cooling they require, total power consumption could be as high as 9.4 gigawatts, comparable to the entire power consumption of the Netherlands. With the expected doubling of production capacity by 2025, that figure could rise to 23 gigawatts, which would make AI one of the largest energy consumers within digital infrastructure worldwide.
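The estimate above is essentially back-of-envelope arithmetic: shipped accelerators times power draw per module, plus an overhead factor for the surrounding servers and cooling. The sketch below reconstructs it using the figures from the article; the per-module wattage and the overhead factor are illustrative assumptions, not numbers from the paper itself.

```python
# Back-of-envelope reconstruction of the article's estimate.
# Shipment figure and gigawatt ranges come from the article;
# watts_per_module and overhead_factor are hypothetical illustrations.

modules_shipped = 5_000_000   # NVIDIA/AMD AI accelerators shipped, 2023-2024
watts_per_module = 1_000      # assumed average draw per module, in watts

# Power of the accelerator modules alone, in gigawatts.
module_power_gw = modules_shipped * watts_per_module / 1e9
print(module_power_gw)        # 5.0, within the article's 3-5.2 GW range

# Full systems add host servers, networking and data-center cooling.
# An overhead factor of roughly 1.8 on the upper bound reproduces
# the article's 9.4 GW figure for complete AI systems.
overhead_factor = 1.8
system_power_gw = 5.2 * overhead_factor
print(round(system_power_gw, 1))  # 9.4
```

With the production capacity expected to double by 2025, scaling the same arithmetic up accordingly lands in the vicinity of the 23 gigawatts the article cites.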
Without better rules or mandatory reporting, it remains virtually impossible to get a good picture of AI's actual energy consumption. According to De Vries-Gao, the rapid growth of this technology clashes with other societal ambitions, such as achieving climate goals and reducing overall energy consumption. He therefore calls for more openness, so that governments can make effective policies that align the development of AI with sustainability. If that does not happen, AI threatens to develop invisibly into an uncontrolled source of energy consumption and carbon emissions.
Read the entire paper here.