The Energy Consumption Concerns of Artificial Intelligence

The rise of artificial intelligence (AI) has sparked numerous discussions about its potential drawbacks. From concerns about academic cheating to job displacement and even the potential annihilation of humanity, AI has attracted its fair share of controversy. Now a new concern has emerged: energy consumption. As AI becomes more capable and its applications expand, experts warn that the electricity required to train and run the underlying models could contribute to climate change.

Alex de Vries, a PhD candidate at Vrije Universiteit Amsterdam, has shed light on this issue. In a commentary published in the journal Joule, de Vries suggests that if every Google search for a year used AI, it would consume as much electricity as a small country like Ireland. De Vries, who is also the founder of Digiconomist, a website that tracks the unintended consequences of digital trends, warns that as demand for AI services grows, so too will the associated energy consumption.

Generative AI, which is increasingly used by the general public, relies on vast amounts of data and compute-intensive machine learning. Models like OpenAI’s ChatGPT and Midjourney’s image-creation tool have become immensely popular. However, de Vries highlights that the AI company Hugging Face has disclosed that its multilingual text-generation model consumed a staggering 433 megawatt-hours during training, enough to power 40 average US homes for a year. And training is only part of the story: even in everyday use, generating outputs such as text still requires significant computing power and energy. De Vries estimates that ChatGPT alone could consume 564 megawatt-hours of electricity per day to run.
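The homes-for-a-year comparison can be sanity-checked with a quick unit conversion. The average-household figure below (about 10,800 kWh per year, in line with US EIA estimates) is an assumption, not a number from the article:

```python
# Rough sanity check of the training-energy comparison quoted above.
TRAINING_MWH = 433             # training consumption reported by Hugging Face (from the article)
US_HOME_KWH_PER_YEAR = 10_800  # assumed average annual US household electricity use

# Convert MWh to kWh, then divide by one home's annual usage.
homes_powered = (TRAINING_MWH * 1_000) / US_HOME_KWH_PER_YEAR
print(f"{homes_powered:.0f} homes powered for a year")  # ≈ 40
```

The result lands at roughly 40 homes, matching the article's figure, though the exact number shifts with the household-consumption assumption.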

While developers are working to improve the efficiency of AI tools, de Vries cautions that efficiency gains can trigger the Jevons paradox, in which increased efficiency simply leads to more usage. Despite efforts to make AI more energy-efficient, growing demand for the technology could therefore result in even greater overall energy consumption.

Based on available data, de Vries estimates that if Google were to use AI for its approximately 9 billion daily searches, it would require 29.2 terawatt-hours of power per year, roughly equivalent to Ireland’s annual electricity consumption. Although this scenario is unlikely in the near future, the capacity to serve AI demand is expected to grow. By 2027, de Vries projects that worldwide AI-related electricity consumption could rise by 85 to 134 terawatt-hours annually, comparable to the electricity needs of countries such as the Netherlands, Argentina, or Sweden.
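As a back-of-the-envelope check, the 29.2 TWh figure implies a per-search energy cost that the article does not state directly. The sketch below derives it under the assumption that all 9 billion daily searches use AI:

```python
# Derive the implied energy per AI-assisted search from the article's figures.
SEARCHES_PER_DAY = 9e9  # approximate daily Google searches (from the article)
ANNUAL_TWH = 29.2       # projected annual consumption if every search used AI

# TWh -> Wh, spread over a year of searches.
wh_per_search = ANNUAL_TWH * 1e12 / (SEARCHES_PER_DAY * 365)
print(f"{wh_per_search:.1f} Wh per AI-assisted search")  # ≈ 8.9
```

At roughly 9 Wh per query, an AI-assisted search would use many times the energy of a conventional one, which is the core of de Vries's concern.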

De Vries’s previous research has also examined the energy costs associated with other buzz-worthy technologies, such as cryptocurrencies. The massive energy requirements for cryptocurrency transactions, particularly Bitcoin, have raised concerns about their carbon footprint, with some estimates suggesting that Bitcoin’s emissions could match those of an entire nation such as New Zealand.

The potential impact of AI on energy consumption is a reminder that as the technology continues to advance, society needs to be mindful of how and where it is employed. While the benefits of AI are numerous, its energy-intensive nature necessitates careful consideration of its applications. By understanding the energy demands and working towards more efficient usage, we can harness the power of AI while minimizing its impact on the climate.


Written By

Jiri Bílek

In the vast realm of AI and U.N. directives, Jiri crafts tales that bridge tech divides. With every word, he champions a world where machines serve all, harmoniously.