Energy consumption of AI could be equivalent to that of a small country
AI could have a large energy footprint in the future, potentially exceeding the power demands of some countries. Improvements in AI efficiency may paradoxically drive up total energy use by increasing demand, an effect known as the Jevons paradox. Google processes up to 9 billion searches a day; if every search used AI, the company would need about 29.2 TWh of power a year, roughly Ireland's annual electricity consumption.
Artificial intelligence (AI) is believed to help coders code faster, make daily tasks less time-consuming, and improve driving safety. However, a recent commentary published in the journal Joule by the founder of Digiconomist suggests that widespread adoption of the technology could result in a large energy footprint, one that could eventually exceed the electricity demand of some countries.
The author of the commentary, Alex de Vries, a Ph.D. candidate at Vrije Universiteit Amsterdam, states, "Looking at the growing demand for AI services, it's very likely that energy consumption related to AI will significantly increase in the coming years."
Generative AI tools, which produce text, images, or other data, have grown rapidly since 2022, with OpenAI's ChatGPT a prominent example. Training these tools requires large amounts of data and is an energy-intensive process. Hugging Face, an AI company based in New York, reported that its multilingual text-generating AI tool consumed about 433 megawatt-hours (MWh) during training, enough to power 40 average American homes for a year.
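The "40 homes" comparison can be checked with a quick back-of-the-envelope calculation. The per-home figure below (about 10.8 MWh per year) is an assumed round number for average US household electricity use, not a value from the article:

```python
# Sanity check: 433 MWh of training energy vs. average US household use.
TRAINING_MWH = 433
HOME_MWH_PER_YEAR = 10.8  # assumed average annual US household consumption

homes_powered = TRAINING_MWH / HOME_MWH_PER_YEAR
print(f"Enough to power about {homes_powered:.0f} homes for a year")
# Prints roughly 40, matching the article's comparison.
```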
Furthermore, AI's energy demand does not end with training. De Vries's analysis shows that when the tool generates output in response to prompts, every text or image it produces uses a significant amount of computing power, and thus energy. ChatGPT, for example, may consume 564 MWh of electricity every day.
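To put that daily figure in the same units used elsewhere in the piece, it can be scaled to a year, a simple multiplication assuming constant daily consumption:

```python
# Scale ChatGPT's estimated daily electricity use to an annual total.
DAILY_MWH = 564  # estimated daily consumption from the commentary

annual_mwh = DAILY_MWH * 365
annual_gwh = annual_mwh / 1000  # 1 GWh = 1000 MWh
print(f"~{annual_gwh:.0f} GWh per year")  # about 206 GWh/year
```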
Companies worldwide are striving to make AI hardware and software more efficient, and therefore less energy-intensive, but an increase in a machine's efficiency often leads to an increase in demand for it. According to de Vries, such technological advancements can produce a net increase in resource use, a phenomenon known as the Jevons paradox.
Making these tools more efficient and accessible can allow more applications and more people to use them, de Vries says. For instance, Google has been incorporating generative AI into its email service and testing AI-powered search. Currently, the company processes up to 9 billion searches a day. Based on these figures, de Vries estimates that if every Google search used AI, it would require approximately 29.2 TWh of power per year, equivalent to the annual electricity consumption of Ireland.
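Working backward from the article's two figures (9 billion searches a day and 29.2 TWh a year) reveals the per-search energy that estimate implies; the article does not state this number directly, so it is derived here only as a consistency check:

```python
# Derive the implied energy per AI-assisted search from the article's totals.
SEARCHES_PER_DAY = 9e9       # Google searches per day
ANNUAL_TWH = 29.2            # projected annual consumption if all use AI

searches_per_year = SEARCHES_PER_DAY * 365
wh_per_search = ANNUAL_TWH * 1e12 / searches_per_year  # 1 TWh = 1e12 Wh
print(f"~{wh_per_search:.1f} Wh per AI-assisted search")  # about 8.9 Wh
```

For comparison, a figure of a few watt-hours per request is orders of magnitude above a conventional keyword search, which is what drives the Ireland-scale total.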
However, de Vries notes that this extreme scenario is unlikely to occur in the short term because of the high cost of additional AI servers and bottlenecks in the AI server supply chain. Nevertheless, AI server production is projected to grow rapidly in the near future. Based on those production projections, de Vries estimates that worldwide AI-related electricity consumption could increase by 85 to 134 TWh annually by 2027.
This amount is comparable to the annual electricity consumption of countries such as the Netherlands, Argentina, and Sweden. Furthermore, improvements in AI efficiency could also enable developers to repurpose some computer processing chips for AI use, which could further increase AI-related electricity consumption.
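A small sketch can place the projected range next to the countries named. The country figures below are rough, assumed values for annual electricity consumption, included only to illustrate the comparison, not sourced from the article:

```python
# Compare the projected 85-134 TWh/year range with approximate annual
# electricity consumption of the countries the article mentions.
PROJECTION_TWH = (85, 134)  # de Vries's 2027 range

# Assumed approximate annual consumption, TWh (illustrative values only).
countries_twh = {"Netherlands": 111, "Argentina": 125, "Sweden": 131}

for name, twh in countries_twh.items():
    low, high = PROJECTION_TWH
    within = low <= twh <= high
    print(f"{name}: ~{twh} TWh, inside projected range: {within}")
```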
“The potential growth highlights that we need to be very mindful about what we use AI for. It’s energy-intensive, so we don't want to put it in all kinds of things where we don’t actually need it,” de Vries says.