The phenomenon of “AI brain rot” in Silicon Valley takes many forms, but for OpenAI’s Sam Altman, it often manifests as vague pronouncements about artificial intelligence being the ultimate solution to global problems. Altman’s rhetoric escalated this week, with the CEO citing figures for OpenAI’s water and electricity consumption that are dramatically lower than numerous existing studies suggest, raising questions about OpenAI’s transparency around its resource usage.
In a Tuesday blog post, Altman presented internal data on the energy and water required for a single ChatGPT query. He claimed one prompt consumes approximately 0.34 Wh, comparable to what “a high-efficiency lightbulb would use in a couple of minutes.” As for the water needed to cool the data centers processing those queries, Altman suggested that each one, say, a student asking ChatGPT to write an essay, requires “0.000085 gallons of water, roughly one-fifteenth of a teaspoon.” He provided no evidence for these assertions and did not disclose where the data came from. MaagX.com contacted OpenAI for clarification but received no response.
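To be fair, the two comparisons in the post are at least internally consistent. A quick sanity check in Python bears both out; this is a minimal sketch, and the 10 W LED wattage and the standard US gallon and teaspoon volumes are reference values we have assumed, not figures from Altman’s post:

```python
# Sanity-check Altman's per-query claims against his own comparisons.
WH_PER_QUERY = 0.34        # claimed energy per prompt (Wh)
GAL_PER_QUERY = 0.000085   # claimed water per prompt (US gallons)

LED_WATTS = 10             # assumed draw of a "high-efficiency" LED bulb
ML_PER_GALLON = 3785.41    # US gallon in milliliters
ML_PER_TEASPOON = 4.93     # US teaspoon in milliliters

led_minutes = WH_PER_QUERY / LED_WATTS * 60
teaspoon_fraction = GAL_PER_QUERY * ML_PER_GALLON / ML_PER_TEASPOON

print(f"LED runtime per query: {led_minutes:.1f} minutes")         # ~2.0
print(f"Water per query: 1/{1 / teaspoon_fraction:.0f} teaspoon")  # ~1/15
```

The problem isn’t the arithmetic; it’s that the underlying per-query numbers are unverifiable.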
Unpacking the Numbers: OpenAI’s True Water Footprint?
If we take Altman’s figures at face value, a simple calculation reveals a substantial water footprint. OpenAI has claimed that as of December 2024, ChatGPT boasts 300 million weekly active users generating 1 billion messages daily. Using Altman’s own per-query water metric, the chatbot would consume 85,000 gallons of water per day, or more than 31 million gallons annually (the arithmetic is sketched after the tweet below). ChatGPT runs on Microsoft data centers, which are already significant water consumers. While Microsoft plans “closed-loop” facilities that eliminate external water for cooling, those projects are at least a year away from pilot testing.
Fresh numbers shared by @sama earlier today:
300M weekly active ChatGPT users
1B user messages sent on ChatGPT every day
1.3M devs have built on OpenAI in the US
— OpenAI Newsroom (@OpenAINewsroom) December 4, 2024
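Here is the back-of-the-envelope math referenced above, using only the figures cited in this article; the daily energy total is our own extrapolation from the same numbers:

```python
# Scale Altman's per-query claims by OpenAI's reported traffic.
GAL_PER_QUERY = 0.000085           # Altman's claimed water per prompt (gallons)
WH_PER_QUERY = 0.34                # Altman's claimed energy per prompt (Wh)
MESSAGES_PER_DAY = 1_000_000_000   # OpenAI's reported daily message volume

daily_gallons = GAL_PER_QUERY * MESSAGES_PER_DAY
annual_gallons = daily_gallons * 365
daily_mwh = WH_PER_QUERY * MESSAGES_PER_DAY / 1_000_000  # Wh -> MWh

print(f"Water: {daily_gallons:,.0f} gal/day, {annual_gallons:,.0f} gal/year")
print(f"Energy: {daily_mwh:,.0f} MWh/day")
# Water: 85,000 gal/day, 31,025,000 gal/year
# Energy: 340 MWh/day
```

Even by Altman’s own conservative numbers, the totals are far from trivial.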
Contrasting Studies: A Starkly Different Picture
Data centers were power- and water-intensive long before the rise of generative AI, and the trend is accelerating: Microsoft’s water usage jumped notably between 2021 and 2022, following its partnership with OpenAI. A 2023 study by University of California, Riverside researchers indicated that the older GPT-3 version of ChatGPT consumed about 0.5 liters of water for every 10 to 50 queries. At 1 billion messages a day, that works out to between 10 and 50 million liters of water daily, with a mid-range estimate of roughly 31 million liters (approximately 8.2 million gallons) per day. And that figure pertains to an older model, not the current, far more powerful (and resource-demanding) GPT-4.1, let alone the o3 reasoning model.
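The study’s per-site variation is why that band is so wide. A minimal sketch, assuming the same 1 billion messages per day and the study’s 0.5 liters per 10-to-50 queries, shows where the roughly 31-million-liter estimate falls:

```python
# Extrapolate the UC Riverside GPT-3 water figure to current traffic.
LITERS_PER_BATCH = 0.5              # study: ~0.5 L of cooling water...
QUERIES_PER_BATCH_RANGE = (10, 50)  # ...per 10 to 50 queries, site-dependent
MESSAGES_PER_DAY = 1_000_000_000
LITERS_PER_GALLON = 3.785

best = LITERS_PER_BATCH / QUERIES_PER_BATCH_RANGE[1] * MESSAGES_PER_DAY
worst = LITERS_PER_BATCH / QUERIES_PER_BATCH_RANGE[0] * MESSAGES_PER_DAY

for label, liters in (("best case", best), ("worst case", worst)):
    gallons = liters / LITERS_PER_GALLON
    print(f"{label}: {liters / 1e6:.0f}M liters/day (~{gallons / 1e6:.1f}M gallons)")
# best case: 10M liters/day (~2.6M gallons)
# worst case: 50M liters/day (~13.2M gallons)
```

The roughly 31 million liters cited above sits comfortably inside that band, and it still describes GPT-3-era hardware.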
The Hidden Costs: Energy, Model Size, and Training
The scale of an AI model directly influences its energy consumption, and numerous studies have highlighted the environmental impact of training these models. As models are continuously retrained and grow more capable, electricity costs will inevitably rise. Altman’s figures also fail to differentiate which queries run through which of OpenAI’s ChatGPT products, including the premium $200-a-month Pro subscription. Furthermore, his calculations conveniently ignore that generating AI images demands significantly more energy than processing text queries.
Silicon Valley Optimism vs. Environmental Reality
Altman’s entire blog post is laden with familiar Big Tech optimism, cloaked in talking points that often lack substance. He posits that data center production will become “automated,” leading the cost of AI to “eventually converge to near the cost of electricity.” Even if we charitably interpret this as AI expansion somehow offsetting its own energy requirements, we are still faced with current realities and escalating global temperatures. Various companies have proposed solutions to the AI water and electricity dilemma, ranging from submerging data centers in the ocean to constructing dedicated nuclear power plants. But nuclear facilities take years to build and license; until one is operational, these companies will continue to rely on burning fossil fuels.
Beyond Environmental Costs: Societal Impact and the “AI Panacea”
The OpenAI CEO’s blog encapsulates the bullheaded thinking prevalent among tech oligarchs. He casually mentions that “entire classes of jobs” will become obsolete, dismissing the concern because “the world will be getting so much richer so quickly that we’ll be able to seriously entertain new policy ideas we never could before.” Altman and his peers have floated universal basic income (UBI) as a means to mitigate AI’s impact, yet there’s little indication of serious commitment. His efforts seem more focused on cozying up to figures like President Donald Trump to prevent future AI regulation than on genuinely advocating for such societal safety nets.
“We do need to solve the safety issues,” Altman concedes, yet this acknowledgment doesn’t temper the push to integrate AI into every facet of life. He implies we can overlook the warming planet because AI will eventually resolve this “niggling issue.” But as temperatures rise, thereby increasing the water and electricity needed to cool data centers, it’s doubtful AI can deliver solutions before irreversible damage occurs. The prevailing message seems to be: ignore the present crises and focus on the next, still unrevealed, Jony Ive-designed gadget that may well gaslight you as the world continues to burn.
This relentless downplaying of AI’s environmental and societal costs, exemplified by Altman’s recent claims, demands critical scrutiny. While AI holds transformative potential, its current trajectory, fueled by unchecked optimism and a disregard for resource consumption, is unsustainable. Readers should question the narratives spun by tech leaders and seek a more balanced understanding of both AI’s promise and its peril, particularly where OpenAI’s resource usage and transparency are concerned.