Recap: Generative AI and large language models continue to gain popularity among individuals and companies despite mounting research that questions the technology's usefulness. As AI investments expand, climbing daily usage statistics will likely intensify scrutiny of the environmental impact.

OpenAI recently told Axios that its popular generative AI chatbot, ChatGPT, receives over 2.5 billion prompts per day globally. Of those, over 330 million, or around 13 percent, come from the US.

Without providing exact numbers, the company also confirmed that most of its over 500 million weekly active users use the free version. Bloomberg previously reported that ChatGPT has approximately 3 million paying users, a 50 percent increase from earlier this year.

The numbers could draw more attention to one of the primary criticisms of LLMs: their water and energy consumption. OpenAI CEO Sam Altman claims that ChatGPT prompts use approximately 0.34 watt-hours and around 0.32 milliliters of water, or one-fifteenth of a teaspoon on average. However, recent research suggests that the true figure could vary wildly depending on the LLM used.

Researchers from the universities of Rhode Island and Tunis estimate that GPT-4o, the default model for free ChatGPT users, is one of the most efficient, using between 1ml and 2ml per prompt. Still, other variants of GPT-4 can consume between 0.1ml (GPT-4.1 nano) and 30ml (GPT-4.5).

Based on Altman's estimate, 2.5 billion prompts might consume roughly 212,500 gallons of water per day, or over 77 million gallons annually. However, researchers believe GPT-4o's yearly water footprint could be several times higher. Cooling-water and infrastructural water usage for the LLM's data centers might permanently remove at least 352 million gallons of freshwater from local ecosystems by the end of 2025, or enough drinking water for 1.2 million people. ChatGPT will also likely consume millions to billions of watt-hours this year, roughly equivalent to the usage of thousands of households.
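The back-of-envelope math behind those figures is straightforward to check. A minimal sketch, using only Altman's claimed 0.32 ml per prompt and the 2.5 billion daily prompts reported to Axios (the 365-day extrapolation and the US-gallon conversion are our own assumptions):

```python
# Sanity-check the article's water figures from Altman's claimed average.
ML_PER_US_GALLON = 3785.41  # milliliters in one US gallon

prompts_per_day = 2.5e9   # OpenAI's figure reported to Axios
ml_per_prompt = 0.32      # Altman's claimed average per prompt

daily_gallons = prompts_per_day * ml_per_prompt / ML_PER_US_GALLON
yearly_gallons = daily_gallons * 365  # naive flat extrapolation

print(f"{daily_gallons:,.0f} gallons/day")    # ~211,000
print(f"{yearly_gallons:,.0f} gallons/year")  # ~77 million
```

Note that this only counts per-prompt consumption; it excludes the cooling and infrastructure water the researchers attribute to the data centers themselves, which is why their 352-million-gallon estimate runs several times higher.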

As the technology's appetite and popularity grow, so do doubts over its future. LLM use is becoming entrenched in higher education, accelerating in courtrooms, and mandatory at some companies. Yahoo Japan expects AI usage to double productivity by 2028, but multiple reports indicate that the technology actually slows workers down.

Microsoft cut 9,000 jobs to offset an $80 billion AI investment. The technology also recently made Nvidia the world's first $4 trillion company. However, a top economist warns that AI has inflated a bubble even larger than the 2000 dot-com bubble. The S&P 500 may have become dangerously dependent on massive AI bets from a few companies, and EZPR's Ed Zitron recently outlined how the sector's growth mostly stems from sales of Nvidia data center GPUs. The chipmaker might have become a single point of failure for an industry that has yet to demonstrate significant profitability.