EPA Loosens Power Rules as AI Demands Surge

In a move aimed squarely at keeping America at the forefront of the artificial intelligence (AI) race, the U.S. Environmental Protection Agency (EPA) has issued new guidance to ensure that data centers—those massive, humming hubs of digital activity—have the electricity they need to operate without interruption.
Last week, the EPA released a regulatory clarification allowing certain backup engines to operate for up to 50 hours per year during non-emergency conditions, easing prior restrictions under the National Emission Standards for Hazardous Air Pollutants (NESHAP) for Reciprocating Internal Combustion Engines (RICE). These engines are critical to keeping data centers online when the electric grid is under pressure.
The move is more than bureaucratic fine print—it’s a signal that the U.S. government is prioritizing infrastructure that supports the future of AI. For better or for worse.
“The Trump Administration is taking action to rectify the previous Administration’s actions to weaken the reliability of the electricity grid and our ability to maintain our leadership on artificial intelligence,” said EPA Administrator Lee Zeldin.
The decision also came in response to Duke Energy’s request for a determination of whether its PowerShare Mandatory 50 demand program complied with current EPA rules. According to the EPA, it does, opening the door for similar programs nationwide.
But as the nation powers up for a data-driven future, the cost isn’t just measured in dollars or kilowatt-hours.
AI’s Growing Thirst for Water and Power
Artificial intelligence might live in the cloud, but it has a very real, very physical footprint. Every prompt you run, every model you train, and every query you send to a chatbot draws not only energy but also water.
Cooling data centers is a major culprit. The servers running large-scale AI models like GPT-4 generate tremendous heat and often rely on water-based cooling towers to stay operational. Some estimates suggest that each AI prompt can indirectly consume about 16 ounces of water, mostly through cooling-related evaporation.
Zoom out, and the numbers are staggering: global AI usage could demand 0.38 to 0.60 billion cubic meters of water annually. That is roughly half of Denmark’s total annual water withdrawal, and more than triple that of an entire country like Liberia.
A few more eye-openers:
- Microsoft’s water use jumped 34% from 2021 to 2022, hitting nearly 1.7 billion gallons.
- Google used 5.56 billion gallons in the same year, largely for data center cooling.
- Training OpenAI's GPT-3 alone reportedly consumed about 700,000 liters of fresh water.
- A modest 1-megawatt data center can use up to 26 million liters annually.
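Because these figures mix gallons, liters, and ounces, they can be hard to compare at a glance. A quick back-of-the-envelope conversion puts the article’s numbers on a common scale (liters); the conversion constants are standard, and the input figures are simply the estimates quoted above.

```python
# Convert the article's reported water-use figures to liters for comparison.
GALLON_L = 3.785   # liters per US gallon
OUNCE_L = 0.0296   # liters per US fluid ounce

per_prompt_l = 16 * OUNCE_L        # ~0.47 L per AI prompt (estimate cited above)
microsoft_l = 1.7e9 * GALLON_L     # Microsoft 2022: ~6.4 billion liters
google_l = 5.56e9 * GALLON_L       # Google 2022: ~21 billion liters
gpt3_training_l = 700_000          # GPT-3 training, reported directly in liters

print(f"Per prompt: {per_prompt_l:.2f} L")
print(f"Microsoft 2022: {microsoft_l / 1e9:.1f} billion L")
print(f"Google 2022: {google_l / 1e9:.1f} billion L")
```

On this common scale, a single training run like GPT-3’s (about 700,000 liters) is small next to a hyperscaler’s annual cooling draw, which runs into the billions of liters.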
Water stress isn’t evenly distributed, either. Many tech firms are placing data centers in developing regions where water may already be scarce, raising serious concerns about geographic and environmental inequality. In areas already grappling with drought or infrastructure challenges, this trend could exacerbate local hardship.
All of this puts added pressure on power and environmental regulators to adapt policies that keep the grid strong while also protecting natural resources.
A Balancing Act for the Future
EPA Administrator Zeldin’s Powering the Great American Comeback Initiative is part of a broader strategy to navigate that balance—unleashing all forms of domestic energy while also ensuring energy reliability, affordability, and now, increasingly, environmental sustainability.
The new clarification is a technical but critical piece of the puzzle. As data centers scale and AI adoption surges, the infrastructure that supports them must evolve in lockstep.
It’s clear the future of AI isn’t just about algorithms and innovation—it’s about how we power and cool the machines behind the scenes, without overheating the planet.