OpenAI chief says viral “17 gallons per query” claims are false, calls for faster push toward nuclear and renewables
OpenAI CEO Sam Altman has rejected claims that artificial intelligence is guzzling vast amounts of water, calling viral estimates “completely untrue,” while acknowledging that energy consumption remains a legitimate concern as AI adoption accelerates.
Speaking during his visit to India for the AI Impact Summit, Altman said widely circulated figures about water use per ChatGPT query are based on outdated assumptions about data centre cooling.
“Water is totally fake. It used to be true. We used to do evaporative cooling in data centres, but now we don’t do that,” Altman said.
He specifically dismissed online claims suggesting a single ChatGPT query consumes 17 gallons of water, calling them “totally insane” and disconnected from reality.
Water vs. power: Where the real strain lies
While pushing back on water usage narratives, Altman conceded that AI’s growing electricity demand deserves scrutiny.
“It’s fair to be concerned about the energy consumption — not per query, but in total, because the world is now using so much AI,” he said.
The distinction matters.
- Per-query comparisons can be misleading
- Aggregate power demand is rising rapidly
- Data centre expansion is accelerating globally
Altman argued that scaling AI strengthens the case for expanding nuclear, wind and solar energy to meet future compute needs.
Debunking per-query energy comparisons
Altman also dismissed claims equating a single AI prompt to multiple smartphone battery charges.
“There’s no way it’s anything close to that much,” he said.
He suggested that conversations around AI’s environmental cost often lack context, particularly when the energy used to train AI models is cited in isolation.
“People talk about how much energy it takes to train an AI model. But it also takes a lot of energy to train a human,” Altman said, noting that human intelligence requires decades of resource consumption.
In his framing, the more relevant metric is comparative efficiency:
How much energy does it take for an AI system, once trained, to answer a question versus a human performing the same task?
Measured that way, he implied, AI may already be competitive.
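Altman’s framing can be sketched as a back-of-envelope calculation. The figures below are illustrative placeholders, not reported measurements from OpenAI or the summit; they only show how such a per-answer comparison would be set up:

```python
# Back-of-envelope comparison of inference-time energy per answer.
# ALL figures are assumed for illustration; none come from the article.

AI_WH_PER_QUERY = 0.3         # assumed energy per AI query, watt-hours
HUMAN_POWER_W = 100           # assumed draw of a person plus computer, watts
HUMAN_MINUTES_PER_ANSWER = 5  # assumed time for a human to research an answer

# Energy (Wh) = power (W) x time (h)
human_wh = HUMAN_POWER_W * (HUMAN_MINUTES_PER_ANSWER / 60)

print(f"AI:    {AI_WH_PER_QUERY:.2f} Wh per answer")
print(f"Human: {human_wh:.2f} Wh per answer")
print(f"Ratio: {human_wh / AI_WH_PER_QUERY:.0f}x")
```

Under these assumed numbers the human answer costs roughly an order of magnitude more energy; the point is the structure of the comparison, not the specific ratio, which shifts entirely with the inputs chosen.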
India’s clean energy edge
Altman’s remarks come as India sharpens its AI ambitions while confronting infrastructure constraints.
At the same summit, Union IT Minister Ashwini Vaishnaw acknowledged that AI is energy- and resource-intensive, but highlighted India’s relative strength: over 51% of its installed power capacity comes from clean sources.
Vaishnaw said nuclear power is under consideration alongside renewable expansion to support future compute demand. He described power and infrastructure as challenges, but manageable ones given coordinated policy and startup collaboration.
The bigger picture
The environmental debate around AI is intensifying as models grow larger and deployment widens.
Water use, once a flashpoint, may be fading as cooling technologies evolve. Energy, however, remains central.
As AI becomes embedded across industries, the sustainability question shifts from “Is it wasteful?” to “Can grids keep up?”
For Altman, the answer lies not in slowing AI but in accelerating clean energy.
TL;DR:
Sam Altman dismissed claims that AI uses excessive water per query, calling viral estimates “completely untrue.” He acknowledged that overall energy consumption is a valid concern as AI scales, urging faster investment in nuclear and renewable power. India, with over 51% clean power capacity, sees an opportunity to support AI growth sustainably.