Generative artificial intelligence is here to stay. Millions of people now use tools like ChatGPT, Midjourney, or Suno AI to create all kinds of content. For users, it’s easy: They open an app, type a prompt, and the desired content appears as if by magic. In reality, though, that result comes from a complex process that involves gathering vast amounts of data, training large models, and running inference.
These tasks are typically not performed on our personal devices. Instead, the companies behind GenAI apps rely on compute-intensive data centers to handle most of the work. It’s no secret that this type of infrastructure requires cutting-edge processing units, such as Nvidia’s H100 GPUs. However, it’s been challenging to gauge the exact resources involved due to limited public information. For now, we know these processes consume a significant amount of energy and water.
A ChatGPT Prompt, a Small Bottle of Water
Data centers consume water, but how much do they actually use? Answering this question is no easy task, but some data points help frame the issue. The current surge in AI development has led Microsoft to increase its water consumption significantly. According to the company’s 2022 Environmental Sustainability Report, its water consumption grew by 34% from 2021 to 2022. Notably, OpenAI launched ChatGPT in November 2022, and the model had been in training for some time before that.
You might be wondering about the connection between ChatGPT, an OpenAI product, and Microsoft. As a matter of fact, the AI startup has a strategic relationship with the tech giant. Essentially, OpenAI uses the Azure AI cloud infrastructure to train and run its AI products.
Going back to water consumption and tech companies, Microsoft isn’t alone. Google’s worldwide water consumption also increased by 20% during the same period. Again, this coincides with the ongoing boom in AI development.
A recent report in The Washington Post sheds more light on this scenario. Researchers at the University of California, Riverside, concluded that composing a 100-word email using ChatGPT consumes 519 milliliters of water. This is an average figure, but it gives a tangible sense of the processes taking place in data centers. It’s equivalent to a little more than half a liter of water, roughly the size of a standard water bottle.
Water is a crucial resource for the cooling systems of the largest data centers. However, consumption varies depending on a center’s location, the time of year, and the weather. Environmental concerns about these water-intensive facilities aren’t new. Pressed by regulators and activists, tech companies have long been working on solutions to limit resource consumption.
For example, in 2018, Microsoft submerged an experimental data center in the ocean to use seawater for cooling. In addition, companies like Meta and Google are looking to place servers in locations where water isn’t scarce. Despite these efforts, GenAI will still need data centers to function, so it’s clear they’ll continue to multiply around the world, driven by multibillion-dollar investments in a race to dominate the industry.
Image | Filip Baotić | Microsoft
Related | Mark Zuckerberg Keeps Saying That His AI Model Is Open Source, But He's Misusing the Term