Tools like ChatGPT, AI-powered apps such as Photoshop, and image-generation platforms like Midjourney are well suited to certain workflows. However, even the free versions are costly to operate due to the significant computing power required to process users’ requests, which in turn drives high demand for energy and for the water used to cool data centers.
In fact, AI is expected to increase global energy consumption so much that companies like Google and Meta are already considering nuclear energy to meet their needs. However, the International Energy Agency (IEA) has pointed out that we may be overestimating the energy consumption associated with AI. The real issue is global warming.
Exceeding the consumption of 100 countries. AI has been part of our lives for a long time, but its accessibility and demand have surged in recent years. Although some algorithms have been speeding up processes for years, the rise of generative AI has dramatically increased energy consumption.
Major players like Microsoft and Google each reported energy consumption of 24 TWh in 2023. To put this into perspective, their combined consumption exceeds that of 100 countries. In fact, each company individually consumes more energy than nations such as Libya and Azerbaijan.
What does the IEA say? The IEA has consistently expressed concerns about the high energy consumption associated with AI and data centers. The agency predicts that this demand will increase significantly in the short term due to the growing reliance on AI-based systems. However, it also warns that these projections may overstate future energy demand.
In its latest World Energy Outlook report, the IEA highlights that while investments in artificial intelligence are rising, hardware is becoming more efficient, allowing more tasks to be completed with less energy. As a result, the energy demand from data centers is expected to be significantly lower than that of other sectors.
Air conditioning. IEA data indicates that AI data centers will require about 202.8 TWh of energy by 2030. This figure is comparable to the energy needed for desalination systems, which are also energy-intensive because they produce drinking water. By contrast, data centers’ demand will be far lower than that of other uses, such as air conditioning and electric vehicles.
Specifically, data centers account for only 3% of the projected increase in energy demand, and their 202.8 TWh is roughly one-third of the consumption expected for cooling systems by 2030, which is estimated at around 676 TWh. Data center energy consumption will also fall below the anticipated 473.2 TWh required to heat spaces during cold months.
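As a rough sanity check on the comparison above, the ratios between the article’s projected 2030 figures work out as follows (a minimal sketch using only the numbers quoted in this article):

```python
# Projected 2030 electricity demand in TWh, as quoted in the article.
data_centers = 202.8      # AI data centers
air_conditioning = 676.0  # cooling systems
space_heating = 473.2     # heating spaces during cold months

# Data centers' demand relative to air conditioning: about one-third.
ratio_cooling = data_centers / air_conditioning
print(f"Data centers vs. air conditioning: {ratio_cooling:.0%}")  # 30%

# Data centers also sit well below projected heating demand.
print(data_centers < space_heating)  # True
```

The 202.8 TWh figure divided by 676 TWh comes out at exactly 30%, consistent with the “one-third” framing in the text.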
The IEA says in its report:
“At a global level, data centres account for a relatively small share of overall electricity demand growth to 2030. More frequent and intense heatwaves than we assume in the STEPS, or higher performance standards applied to new appliances–notably air conditioners–both produce significantly greater variations in projected electricity demand than an upside case for data centres. The combination of rising incomes and increasing global temperatures generate more than 1 200 TWh of extra global demand for cooling by 2035 in the STEPS, an amount greater than the entire Middle East’s electricity use today.”
AI energy consumption remains high. While the AI boom may not lead to a catastrophic increase in global energy demand, consumption is still expected to be substantial. In response, the IEA has organized a global summit to address the challenges posed by rising AI energy consumption. The event is scheduled for Dec. 5 in Paris and will bring together key players from the sector.
The issue extends beyond just energy consumption. In addition to AI’s substantial resource use, it also generates a considerable amount of electronic waste. Spending on AI increased eightfold from 2022 to 2023, with a significant portion dedicated to constructing and equipping data centers.
Moreover, the number of computing systems isn’t the only critical factor; their underlying technology also matters. Many companies are replacing older equipment with the latest GPUs from manufacturers like Nvidia, generating a large volume of waste.
Reusing a GPU that has been operating continuously can be challenging, but some suggest that this “older” equipment should be repurposed for less demanding tasks such as web hosting and backups, or even donated to educational institutions.
Image | Prasopchok