Sam Altman, water claims and the energy debate around AI
Sam Altman, CEO of OpenAI, has landed in a heated debate over how much water and power AI really uses. Critics say his remarks reveal a technocratic mindset that treats human life like an efficiency problem; one commentator went as far as calling such figures “deeply antisocial and antihuman.” The exchange has pushed questions about resource use and moral priorities into public view.
At a recent AI Impact Summit in India, the tension showed up in a photo-op when Anthropic CEO Dario Amodei refused to hold hands with Altman. The gesture was a visible sign of how personal and sharp these disputes have become among industry leaders, and it fed the larger argument about whether AI firms are out of touch with basic human concerns.
The topic of water use in AI data centers has circulated online for years, often wrapped in alarming figures about gallons per query. Altman took those claims head-on during an interview at the Express Adda event. He argued that many of the viral water numbers are outdated or misleading and that the real issue is total energy demand.
He said: “Water is totally fake. It used to be true, we used to do evaporative cooling in data centers, but now that we don’t do that, you see these things on the internet where [it’s] ‘don’t use ChatGPT, it’s 17 gallons of water for each query, or whatever’.”
“This is completely untrue, it’s totally insane, no connection to reality.
“What is fair, though, is the energy consumption – not per query, but in total – because the world is using so much AI is real and we need to move towards nuclear or wind and solar very quickly.”
The interviewer, Anant Goenka of The Indian Express, turned the conversation toward energy and mentioned Bill Gates’s view that AIs will learn from human evolution to use energy more efficiently. Altman pushed back on a simple apples-to-oranges comparison between training models and human learning. He urged a fuller accounting of where energy goes in both cases.
“One of the things that is always unfair in this comparison is people talk about how much energy it takes to train an AI model, relative to how much it costs a human to do one inference query.”
“But it also takes a lot of energy to train a human. It takes like 20 years of life, and all the food you eat before that time, before you get smart. And not only that, it took like the very widespread evolution of the hundred billion people that have ever lived and learned not to get eaten by predators and learned how to figure out science and whatever to produce you, and then you took whatever you took.”
“The fair comparison is if you ask ChatGPT a question, how much energy does it take once a model is trained to answer that question, versus a human, and probably AI has already caught up on an energy efficiency basis, measured that way.”
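Altman’s “fair comparison” can be sketched as a back-of-envelope calculation. Every figure below is an illustrative assumption, not a measurement: a human brain is commonly estimated to draw about 20 watts, a person is assumed to take a minute to answer a question, and 0.3 Wh is one widely circulated estimate for a single ChatGPT query (published estimates vary by an order of magnitude in both directions).

```python
# Back-of-envelope comparison of per-answer energy, human vs. AI.
# All inputs are rough, illustrative assumptions -- not measured data.

BRAIN_POWER_W = 20.0          # assumed resting power draw of a human brain, watts
HUMAN_ANSWER_SECONDS = 60.0   # assume a person spends one minute on the answer
QUERY_ENERGY_WH = 0.3         # one commonly cited per-query estimate, watt-hours

# Energy the human spends on one answer: watts * seconds -> watt-hours
human_wh = BRAIN_POWER_W * HUMAN_ANSWER_SECONDS / 3600.0

print(f"human: {human_wh:.3f} Wh per answer")
print(f"AI:    {QUERY_ENERGY_WH:.3f} Wh per query")
print(f"ratio (AI / human): {QUERY_ENERGY_WH / human_wh:.2f}")
```

Under these assumed figures the two come out in the same ballpark, which is the shape of Altman’s claim; changing any one assumption can easily swing the ratio an order of magnitude either way, which is why the underlying estimates, not the arithmetic, are where the argument really lives.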
Altman’s framing did not sit well with some users on X/Twitter, who read it as devaluing human life in favor of computational efficiency. One wrote that “these people are deeply antisocial and antihuman”. Voices in the thread raised ethical alarms about treating people as costly biological training runs rather than ends in themselves. X user L. David Fairchild laid out the objection at length:

“He’s not just defending AI energy use. He is smuggling in a whole anthropology where humans are basically inefficient meat computers that you have to pour food and years into before they become useful. And once you accept that, the next move is obvious. If people are just costly biological training runs, then burning mountains of electricity to build synthetic intelligence starts to feel not only equal, but superior, even if it negatively impacts actual humans.

“That is the dystopian. It makes human development sound like a bug in the system, and it makes sacrificing human and creational flourishing for more computational power sound logical. To him, the grid gets strained, prices go up, ecosystems get hit, but hey, humans eat too, so what’s the difference?

“The difference is that humans aren’t an inefficient line item. They’re the point. If your worldview can look at a child growing into an adult and describe it as energy spent to train intelligence, you haven’t said something profound. You’ve revealed a horrifically rotten worldview.”
The back-and-forth highlights a real tension between the technical claim that AI is becoming more energy-efficient and the moral concern that efficiency talk can slip into disregard for human costs. The debate is likely to continue as regulators, engineers and the public wrestle with how to power AI’s growth without sidelining human well-being.
https://x.com/David_Fairchild/status/2025235795443913121