In a recent episode of the Hard Fork podcast, New York Times hosts Kevin Roose and Casey Newton sat down with Andrew Marley, executive director of Effective Altruism DC, to investigate a question that rarely makes headlines: the water consumption of AI data centers.
Most people associate AI’s environmental impact with electricity, but cooling systems rely heavily on water. Large‑scale servers generate immense heat, and many facilities use water‑based cooling towers or evaporative chillers to keep temperatures in check. That water is drawn from local supplies, treated, and often discharged back into the environment, sometimes at higher temperatures.
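The relationship between compute load and cooling water is often summarized with the Water Usage Effectiveness (WUE) metric, expressed in liters of water per kilowatt-hour of IT energy. The sketch below is a back-of-envelope estimate, not a figure from the podcast; the 1.8 L/kWh value and the 20 MW facility size are illustrative assumptions.

```python
# Back-of-envelope: cooling water consumed for a given IT energy load.
# WUE (Water Usage Effectiveness) = liters of water per kWh of IT energy.
# The 1.8 L/kWh default is an illustrative assumption, not a reported figure.

def cooling_water_liters(energy_kwh: float, wue_l_per_kwh: float = 1.8) -> float:
    """Estimate liters of water consumed by cooling for the given energy use."""
    return energy_kwh * wue_l_per_kwh

# A hypothetical 20 MW facility running continuously for one year:
annual_kwh = 20_000 * 24 * 365  # 20 MW -> 175.2 million kWh/year
liters = cooling_water_liters(annual_kwh)
print(f"{liters / 1e6:.0f} million liters per year")  # -> 315 million liters per year
```

Even under these rough assumptions, a single midsize facility plausibly consumes hundreds of millions of liters annually, which is why cooling design choices matter so much.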
Marley highlighted that precise figures are hard to pin down because AI workloads fluctuate and data‑center designs vary widely. However, industry analysts suggest that the AI sector could be responsible for anywhere between 10 and 30 million cubic meters of water per year—a volume comparable to the annual consumption of a midsize city.
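The "midsize city" comparison can be sanity-checked with simple arithmetic. The per-capita figure below (roughly 100 m³ per person per year, about 275 liters per day) is an illustrative assumption, not a number from the episode.

```python
# Sanity check: does 10-30 million m^3/year match a midsize city's water use?
# Assumes ~100 m^3 per person per year (~275 L/day), an illustrative figure.

PER_CAPITA_M3_PER_YEAR = 100  # assumed per-capita annual consumption

def equivalent_population(total_m3: float) -> int:
    """Number of people whose annual water use would match the given volume."""
    return round(total_m3 / PER_CAPITA_M3_PER_YEAR)

low, high = 10e6, 30e6  # the 10-30 million m^3 range cited in the text
print(equivalent_population(low), "to", equivalent_population(high), "people")
# -> 100000 to 300000 people
```

A range of roughly 100,000 to 300,000 people is indeed midsize-city territory, so the comparison holds up under these assumptions.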
Complicating matters further, the rapid expansion of AI‑focused facilities in water‑scarce regions raises concerns about local ecosystems and community water access.
Both the podcast hosts and Marley emphasized the importance of adopting mitigation strategies.
The discussion underscored that water stewardship must become a core metric for AI developers and investors, alongside energy use and carbon emissions. As AI continues to reshape economies, transparent reporting on water usage will be essential for aligning technological progress with sustainable resource management.