I wrote a few months ago about the real price of AI – the ethical complexity, the environmental footprint, the fact that my own pirated book is probably in the training data of tools I now use and recommend.
I said then: “If your opinion on AI can fit on a bumper sticker, you’re probably wrong.”
I still believe that. And I’ve been increasingly frustrated watching the environmental conversation around AI collapse into exactly that – bumper stickers.
On one side: “AI is boiling the oceans and produces nothing but slop.”
On the other: “AI concerns are overblown. Everything_is_fine.gif”
Neither is useful. Both are wrong.
And the bad data floating around makes it nearly impossible to have the nuanced conversations we actually need.
So I spent some time digging into the most-shared claims about AI’s environmental impact. What I found made me angry about how badly the conversation has been polluted.
Here’s one example that captures the problem perfectly.
The 4,500x error
A bestselling book on AI’s environmental impact claimed that a single Google data centre in Chile would use “more than one thousand times the amount of water consumed by the entire population of Cerrillos, roughly eighty-eight thousand residents.”
One building. Using 1,000 times more water than 88,000 people.
Or to put it another way, using the same amount of water as a city of 88 million.
Or to put it another way, using the same amount of water as Tokyo.
Well, Tokyo plus Paris.
Plus London.
Plus New York.
Plus Sydney, Rome, Los Angeles, Dublin, AND the rest of the entire population of Ireland.
Yes, I’m a bit frustrated at this misinformation.
The book was published by Penguin Random House. The author has an MIT engineering degree. The New York Times and The Economist praised it. A fact-checking team is thanked in the acknowledgements.
The claim is wrong by a factor of around 4,500.
Data scientist Andy Masley did the maths. The book’s figure implies each Chilean resident uses 0.2 litres of water per day – a fifth of a water bottle. Adults need 2-4 litres just to stay alive.
What happened? The source reported water usage in cubic metres. The book recorded it in litres. A 1,000x unit error, built into the central claim, sailing through fact-checkers, editors, and reviewers at every major publication.
The book has nearly 1,000 Amazon reviews, but a solo blogger was the first to catch it.
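To make the size of the slip concrete, here’s a rough sanity check – a minimal sketch in Python. The 88,000 population comes from the book’s claim; the ~200 litres per person per day is a ballpark municipal figure I’m assuming for illustration, not a sourced number.

```python
# Rough sanity check of the "one data centre vs. a whole city" comparison.
# The population comes from the book's claim; the per-capita figure is an
# assumed ballpark for municipal water use, purely for illustration.

population = 88_000
per_capita_m3_per_day = 0.2                                  # ~200 litres, a typical ballpark
per_capita_litres_per_day = per_capita_m3_per_day * 1_000    # cubic metres -> litres

city_litres_per_day = population * per_capita_litres_per_day
print(f"City of {population:,}: ~{city_litres_per_day / 1e6:.1f} million litres/day")

# The unit slip: reading 0.2 cubic metres as 0.2 litres shrinks the city's
# consumption 1,000-fold, so anything compared against it looks 1,000x bigger.
mistaken_litres_per_day = population * per_capita_m3_per_day
print(f"Same city under the unit error: ~{mistaken_litres_per_day:,.0f} litres/day")
print(f"Inflation from the unit slip alone: {city_litres_per_day / mistaken_litres_per_day:.0f}x")
```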
Look, mistakes happen in books. I get it. I have books with mistakes in them. But this wasn’t a typo. It was a central claim that implies a single building uses more water than a city of 88 million people. Nobody stopped to ask if that made sense.
This is what frustrates me. I genuinely care about AI’s environmental impact. I want to have a serious conversation about it. But I can’t do that when the starting data is complete misinformation – and when pointing out the errors gets you lumped in with people who think there’s no problem at all.
What the water numbers actually show
Google recently stated that a median Gemini AI query uses 0.26 millilitres of water. That’s roughly five drops – not the “half a bottle” claims made on social media.
For some perspective on aggregate use: US data centres directly consumed about 17.4 billion gallons of water on-site in 2023.
That same year, US golf courses used 548 billion gallons.
Golf courses use roughly 30 times more water than data centres.
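These numbers are easy to check yourself. A minimal sketch, using the figures quoted above plus one assumption of mine: that a single drop is roughly 0.05 millilitres.

```python
# Back-of-envelope checks on the water figures quoted above.

ml_per_query = 0.26        # Google's stated median Gemini query
ml_per_drop = 0.05         # rough size of one drop (my assumption)
print(f"Water per query: ~{ml_per_query / ml_per_drop:.0f} drops")

bottle_ml = 500
print(f"Queries per 500 ml bottle: ~{bottle_ml / ml_per_query:,.0f}")

datacentre_gallons = 17.4e9    # US data centres, direct use, 2023
golf_gallons = 548e9           # US golf courses, 2023
print(f"Golf courses vs data centres: ~{golf_gallons / datacentre_gallons:.0f}x")
```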
Now, there’s an important caveat. Indirect water use – the water consumed at power plants generating electricity for data centres – is considerably larger, about 800 billion litres in 2023. The energy source matters enormously. A data centre running on coal has a very different water footprint than one running on solar or wind.
The water impact is real, and it’s worth taking seriously, but it’s also nowhere near “draining the aquifers” – and when we exaggerate it by thousands of times, we make it harder to address the actual problem.
If prestigious publishers can get the water usage stats wrong by 4,500x, what else might be off?
The energy picture
Global data centres used about 415 TWh of electricity in 2024. That’s roughly 1.5% of global electricity and 0.9% of energy-related emissions.
Real impact. Worth paying attention to. Nowhere near apocalyptic.
Here’s a question for the people most vocal about AI power usage: Did you reboil your kettle this morning after letting the water go cold? Or drive somewhere you could have walked? Or put the heating on instead of grabbing a sweater?
Any one of those uses more energy than hundreds of AI queries. Some use thousands of times more.
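For anyone who wants to check that ratio, here’s the back-of-envelope version. Every input is a ballpark assumption of mine: roughly 0.3 Wh per AI query, a 1.5-litre kettle boiled from cold, and a short petrol-car trip burning about 0.4 litres of fuel.

```python
# Ballpark energy comparisons behind the kettle/drive point. All inputs are
# assumed round numbers for illustration, not measurements.

wh_per_query = 0.3                     # assumed energy per AI query, in watt-hours

# Boiling 1.5 L of water from ~15 C to 100 C: mass x specific heat x temp rise
kettle_kj = 1.5 * 4.186 * 85
kettle_wh = kettle_kj / 3.6            # 1 Wh = 3.6 kJ
print(f"One kettle boil ~ {kettle_wh:.0f} Wh ~ {kettle_wh / wh_per_query:.0f} queries")

# A short drive: ~0.4 L of petrol at ~9.7 kWh of energy content per litre
drive_wh = 0.4 * 9.7 * 1_000
print(f"One short drive ~ {drive_wh:,.0f} Wh ~ {drive_wh / wh_per_query:,.0f} queries")
```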
I’m not saying AI energy use doesn’t matter. I’m saying we should think about the energy use and sustainability of everything we do.
And when it comes to data centres, the story gets more interesting. Efficiency varies enormously depending on the type of facility.
Power Usage Effectiveness (PUE) is the ratio of a facility’s total energy draw to the energy used by the computing equipment itself – the excess goes to cooling and other infrastructure. A lower number is better; 1.0 would mean zero overhead.
| Facility Type | Typical PUE |
|---|---|
| Older enterprise/on-premise | 1.5 – 2.0 |
| Industry average | 1.55 – 1.8 |
| Hyperscale (Google, Microsoft, AWS) | 1.09 – 1.2 |
Google’s fleet-wide average is 1.09. That’s about 0.09 units of overhead per unit of computing, versus roughly 0.56 at the industry average – around 84% less.
What does this mean practically? If you’re running AI workloads on ageing on-premise servers, moving them to hyperscale cloud probably reduces your carbon footprint per unit of computation. The efficiency gap is real.
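To put rough numbers on that, here’s a minimal sketch. The 500 kW IT load is an arbitrary assumption for illustration; the PUE values are the ones from the table above.

```python
# Illustration of how PUE translates into total facility energy for a fixed IT load.
# The IT load is an assumed figure; the PUE values come from the table above.

it_load_kw = 500                       # power drawn by the servers themselves (assumed)

def total_power(it_kw: float, pue: float) -> float:
    """Total facility power = IT power x PUE; everything above 1.0 is overhead."""
    return it_kw * pue

on_prem = total_power(it_load_kw, 1.7)       # ageing on-premise server room
hyperscale = total_power(it_load_kw, 1.09)   # Google's reported fleet-wide average

print(f"On-premise: {on_prem:.0f} kW total ({on_prem - it_load_kw:.0f} kW overhead)")
print(f"Hyperscale: {hyperscale:.0f} kW total ({hyperscale - it_load_kw:.0f} kW overhead)")
reduction = 1 - (hyperscale - it_load_kw) / (on_prem - it_load_kw)
print(f"Overhead reduction vs the 1.7 facility: {reduction:.0%}")
```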
That’s genuinely good news. Here’s what complicates it.
The growth problem
The IEA projects global data centre electricity demand to double by 2030.
In the US, Lawrence Berkeley National Laboratory estimates data centres used 176 TWh in 2023 and projects 325-580 TWh by 2028. That’s potentially tripling in five years.
The driver of this growth is AI: GPU workloads are growing at roughly 30% per year, compared with 9% for conventional servers.
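Here’s the growth arithmetic, using the LBNL range and the 30% figure quoted above – nothing assumed beyond the 2023–2028 window those projections cover.

```python
# Rough growth arithmetic from the figures quoted above.

lbnl_2023_twh = 176
lbnl_2028_low, lbnl_2028_high = 325, 580
years = 5                                        # 2023 -> 2028

# Implied compound annual growth rate across the LBNL projection range
cagr_low = (lbnl_2028_low / lbnl_2023_twh) ** (1 / years) - 1
cagr_high = (lbnl_2028_high / lbnl_2023_twh) ** (1 / years) - 1
print(f"Implied US data-centre growth: {cagr_low:.0%} to {cagr_high:.0%} per year")

# What ~30%/yr GPU workload growth compounds to over the same window
print(f"30%/yr for {years} years = {1.30 ** years:.1f}x the 2023 level")
```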
Individual queries are getting more efficient, but we’re also running vastly more of them, so efficiency gains simply won’t keep pace with demand growth.
This is the honest conversation I wish we were having. The environmental cost per query is dropping. The total environmental cost is rising.
Both are true but neither fits on a bumper sticker.
The real question isn’t whether data centres are 1.5% or 3% of global electricity. It’s whether we’re building enough clean generation capacity to meet demand that doubles every few years. That’s a serious planning challenge – one that deserves serious conversation rather than points-scoring.
The accounting gets murky
When Google says it “matches 100% of its electricity use with renewable energy,” that’s technically true.
It’s also somewhat misleading.
Google buys Renewable Energy Certificates equivalent to its consumption. Those certificates might come from a Texas wind farm while a Virginia data centre draws from a coal-heavy grid. The maths balances on paper. The electrons don’t actually match.
A 2024 Guardian investigation found that location-based emissions for Google, Microsoft, Meta and Apple were 662% higher than their market-based figures. That’s a significant gap between the headline claim and the physical reality.
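To see how the two accounting methods can drift that far apart, here’s a toy example. Every number is invented for illustration; nothing here reflects any real provider’s data.

```python
# Toy illustration of market-based vs location-based emissions accounting.
# All inputs are made-up numbers, not real data for any provider.

consumption_mwh = 100_000       # hypothetical annual electricity use of one data centre
local_grid_kg_per_mwh = 600     # emissions intensity of the grid it physically draws from
rec_purchases_mwh = 100_000     # renewable certificates bought, possibly far away

# Location-based: what the local grid actually emitted to serve the load
location_based_t = consumption_mwh * local_grid_kg_per_mwh / 1_000

# Market-based: certificates count as zero-carbon supply, regardless of geography
uncovered_mwh = max(consumption_mwh - rec_purchases_mwh, 0)
market_based_t = uncovered_mwh * local_grid_kg_per_mwh / 1_000

print(f"Location-based: {location_based_t:,.0f} tCO2e")
print(f"Market-based:   {market_based_t:,.0f} tCO2e (100% 'matched' on paper)")
```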
To their credit, Google and Microsoft now pursue “24/7 carbon-free energy” – hour-by-hour matching rather than annual balancing. That’s harder to achieve and more honest.
If you’re evaluating a provider’s sustainability claims, it’s worth asking which accounting method they use. The difference matters.
What this means for you
Hyperscale cloud probably does reduce your per-computation footprint. The PUE gap is real. Google at 1.09 versus your server room at 1.7 is a meaningful difference.
“Cloud is green” isn’t a free pass. Efficiency gains are real, but so is demand growth. More efficient infrastructure doing vastly more work doesn’t automatically shrink your overall footprint.
Water claims need sanity-checking. If someone cites alarming statistics, do the maths. The actual numbers are orders of magnitude smaller than viral claims suggest.
Emissions accounting is complicated. RECs and hour-by-hour matching are different things. It’s worth understanding which one your provider uses.
Question everything. Including this article. I’ve referenced IEA and Lawrence Berkeley National Laboratory data throughout. Check the sources yourself.
What I actually think
AI has real environmental costs. They’re growing. I’m genuinely concerned about that – I said as much in my previous piece on this topic.
Those costs are also nowhere near the catastrophe that bestselling books and viral studies claim. Many of those claims collapse under basic fact-checking.
I find myself in an uncomfortable middle ground. I care about this issue. I also can’t take seriously a conversation where the data is wrong by thousands of times and pointing that out makes you suspect.
There’s almost nothing I can do to stop the global AI race. The momentum is too strong. But I can try to make the conversation around it more honest – and that starts with getting the facts right.
Hyperscale efficiency is real. Demand growth is real. Renewable accounting is complicated.
Water use is modest in aggregate but depends heavily on location and energy sources.
The honest position sits in the middle. This is less shareable than either extreme but far more useful for actual decisions.
I’d rather deal with the real trade-offs than pick a team and defend a bumper sticker.

