
Every email costs a bottle of water: why does generative AI guzzle so much energy?

23 Jun 2025

We use free tools like ChatGPT and Midjourney on a massive scale, but that use doesn’t come without a cost. Every prompt consumes electricity and water. Awareness of this is growing, but ready-made solutions are still lacking, even at the university.

According to ChatGPT itself, it’s not too bad. In response to the question “How much energy do I use by asking ChatGPT this question?”, the language model presents a few figures and concludes: “You probably just used less than 0.1 Wh of energy – that’s much less than brewing a sip of coffee. So in terms of impact: really tiny!”

A typical case of ‘we at WC-Eend recommend WC-Eend’, you might say (the Dutch advertising slogan that has become shorthand for self-serving advice). ChatGPT is also known to occasionally make things up, so-called hallucinations. Still, it’s true that a single question won’t make much of a difference. What worries people is mainly the accumulation of all the small and large questions we ask generative AI around the world.
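For what it’s worth, the coffee comparison roughly holds up. The sketch below is a back-of-envelope check against basic physics; the only figure taken from the article is the 0.1 Wh claim, and the sip size of about 10 milliliters is an assumption.

```python
# Back-of-envelope check of ChatGPT's claim, using the 0.1 Wh figure quoted
# above and standard physics; the ~10 ml "sip" size is an assumption.
query_energy_wh = 0.1              # claimed upper bound per prompt (Wh)

sip_volume_kg = 0.010              # assume a sip is about 10 ml of water
specific_heat = 4186               # J/(kg*K), specific heat of water
temp_rise = 75                     # heating water from ~20 °C to ~95 °C

# Energy to heat one sip, converted from joules to watt-hours (1 Wh = 3600 J)
sip_energy_wh = sip_volume_kg * specific_heat * temp_rise / 3600

print(f"Heating one sip of coffee: ~{sip_energy_wh:.2f} Wh")
print(f"One ChatGPT prompt (claimed): < {query_energy_wh} Wh")
```

Even with a generous sip, the heating energy comes out at just under one watt-hour, so the claimed 0.1 Wh per prompt is indeed far smaller.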

‘Each year, about 30 to 40 percent more energy is used’

Generating images or videos consumes even more energy than text. And our requests are just the tip of the iceberg. Most of the electricity has already been consumed during the training phase, long before we ask for dinner inspiration or a picture of the Pope in a white puffer jacket.

Why does generative AI gobble up so much energy? And how should we deal with that at a university that holds sustainability in high regard?

Lots of Computing

The ICT sector as a whole consumes massive amounts of energy, says Bernard van Gastel, assistant professor of Sustainable Digitalisation at the Institute for Computing and Information Sciences. ‘Since the 1950s, energy use in ICT has been steadily rising, growing year by year.’ The most recent figures now include the impact of AI usage. ‘A few years ago, global consumption was increasing by 20 percent annually; now it’s about 30 to 40 percent.’

It’s hard to say exactly how much energy specific AI tools consume. Major players like Google, OpenAI (the company behind ChatGPT), and the Chinese DeepSeek keep such information behind closed doors. Outsiders therefore rely on estimates, based on what is known more generally, and at a smaller scale, about how AI models are trained and used.

That’s how we know, for instance, that ChatGPT on average uses more energy than a search engine like Google, but estimates of the exact difference vary widely. ‘Roughly speaking, you could say it takes ten to twenty times more energy to ask a question to ChatGPT than to Google,’ says professor of Artificial Intelligence Johan Kwisthout.

While you see an answer appear in ChatGPT in no time, in the background the AI gears are turning at full speed: the program contacts a data center somewhere in the world, uses computing power to analyze what a logical response to your question would be, and delivers it to you as fast as possible.

But the bulk of the energy consumption is in training the tools. To teach AI the difference between a video of a cat and a dog, for instance, you need millions of hours of footage, Kwisthout explains. Such training requires enormous computing capacity and data storage. ‘For language models like ChatGPT, especially the newer versions, we’re talking about a scale we can no longer reach at universities.’

‘If you ask ChatGPT to write an email, that equals a small bottle of water’

By way of illustration: the data center Meta wanted to build in Zeewolde – a plan that the Council of State struck down in 2023 – would have covered 1.66 square kilometers. That’s significantly larger than the combined area of Radboud University, the hospital, and HAN, and, as far as is known, it wasn’t even specifically intended for AI.

Jevons Paradox

Such data centers consume not just energy, but also vast amounts of water to continuously cool the computers during calculations. How much water depends on where such a center is located. In a desert-like area such as Utah, cooling systems have to work harder than in the far north. On average, though, Kwisthout estimates: ‘If you give ChatGPT a few bullet points and ask it to write an email, that equals a half-liter bottle of water.’

Johan Kwisthout. Photo: RU

Here too, estimates vary. Researchers at the University of California calculated in a preprint study that half a liter of water is used per ten to seventy queries, depending on the data center.

But they, too, express concerns. They write that by 2027, the global demand for AI could require 4.2 to 6.6 billion cubic meters of water – half of the United Kingdom’s annual water usage. In a world where water scarcity is becoming an ever greater problem, that could become a major headache.
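Putting those figures side by side takes nothing more than unit conversion. A minimal sketch, using only the numbers quoted in this article:

```python
# Simple arithmetic on the water figures quoted in the article.
# Per-query range: the preprint's "half a liter per ten to seventy queries".
half_liter_ml = 500
per_query_low = half_liter_ml / 70    # many queries per half liter (best case)
per_query_high = half_liter_ml / 10   # few queries per half liter (worst case)
print(f"Water per query: roughly {per_query_low:.0f} to {per_query_high:.0f} ml")

# Projected global AI water demand for 2027, converted from cubic meters
# to liters (1 m3 = 1,000 liters).
demand_m3_low, demand_m3_high = 4.2e9, 6.6e9
print(f"Projected 2027 demand: {demand_m3_low * 1000:.1e} to {demand_m3_high * 1000:.1e} liters")
```

That puts a single query at somewhere between roughly 7 and 50 milliliters of water, while the 2027 projection runs into the trillions of liters.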

Although the industry doesn’t treat sustainability as a priority, Van Gastel does see a lot happening to make AI more efficient. The latest laptops, for example, come equipped with AI chips that can generate images or videos faster and more efficiently. ‘At first you think: that’s great. But when efficiency increases, usage tends to increase too.’

‘You don’t need ChatGPT for every search’

In economics, that effect is known as the Jevons Paradox, named after the economist William Stanley Jevons, who in 1865 observed that more efficient use of coal in industry led to more coal usage, because it became cheaper and more accessible. The same thing is now happening with AI, according to Van Gastel, as it increasingly seeps into our daily lives.

Local AI System

So how should a university that’s doing its best to shrink its ecological footprint respond to this? Is it possible to make AI usage on campus more sustainable? To answer that, it’s useful to understand the current – estimated – climate impact of AI use.

That impact isn’t being monitored yet, according to Mo Tiel, project leader of Radboud Sustainable. ‘Only recently have we, together with Information & Library Services, started looking into the energy consumption from data storage, both on university servers and in the cloud.’ That’s already going to be quite a task, Tiel expects, even without focusing specifically on AI usage. Still, she’s glad to see many conversations happening across campus, raising awareness. ‘You don’t need ChatGPT for every search.’

Not everyone is content with just talking. Together with Jeroen de Jong of the Teaching & Learning Centre (TLC), Van Gastel wants to experiment with AI programs that run on the university’s own system, with local storage. For example, if medical students use such a program to practice diagnostic conversations, the dialogue data stays within the university walls.

‘Where are these developments headed?’

That not only has privacy benefits, but it also means you can calculate how much energy is being used. ‘We can do that by testing a number of computers in our Software Energy Lab while the application is running,’ says Van Gastel.

He hopes their initiative will gradually gain broader traction within the university. ‘In the end, we all have to do this together – not just the techies.’ Ideally, every ICT or AI application would be evaluated as a ‘complete package’. ‘So not just environmental aspects, but also potential ethical and safety concerns. And think about economic sustainability. If you have to pay twice as much per student, then it simply doesn’t work.’

Kwisthout also stresses the need for such a broad assessment when it comes to responsible AI use. He also hopes Radboud University will take part in the political and societal debate beyond the campus. ‘Where are these developments headed, and what do we think of them?’ Because ultimately, like all major sustainability challenges, this isn’t something people can solve from within their own bubble.
