ChatGPT now handles 2.5 billion prompts daily, OpenAI confirms

Daniel Sims

Staff
Recap: Generative AI and large language models continue to gain popularity among individuals and companies despite mounting research that questions the technology's usefulness. As AI investments expand, climbing daily usage statistics will likely intensify scrutiny of the environmental impact.

OpenAI recently told Axios that its popular generative AI chatbot, ChatGPT, receives over 2.5 billion prompts per day globally. Of those, over 330 million, or around 13 percent, come from the US.

Without providing exact numbers, the company also confirmed that most of its over 500 million weekly active users utilize the free version. Bloomberg previously reported that ChatGPT has approximately 3 million paying users, a 50 percent increase from earlier this year.

The numbers could draw more attention to one of the primary criticisms of LLMs: their water and energy consumption. OpenAI CEO Sam Altman claims that ChatGPT prompts use approximately 0.34 watt-hours and around 0.32 milliliters of water, or one-fifteenth of a teaspoon on average. However, recent research suggests that the true figure could vary wildly depending on the LLM used.

Researchers from the universities of Rhode Island and Tunis estimate that GPT-4o, the default model for free ChatGPT users, is one of the most efficient, using between 1ml and 2ml per prompt. Still, other variants of GPT-4 can consume between 0.1ml (GPT-4.1 nano) and 30ml (GPT-4.5).

Based on Altman's estimate, 2.5 billion prompts per day might consume roughly 212,500 gallons of water daily, or over 77 million gallons annually. However, researchers believe GPT-4o's yearly water footprint could be several times higher. Cooling-water and infrastructural water usage for the LLM's data centers might permanently remove at least 352 million gallons of freshwater from local ecosystems by the end of 2025, enough drinking water for 1.2 million people. ChatGPT will also likely consume millions to billions of watt-hours this year, roughly equivalent to the usage of thousands of households.
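The figures above can be sanity-checked with a quick back-of-envelope calculation. This sketch assumes Altman's 0.32 ml per-prompt estimate, 2.5 billion prompts per day, and a 365-day year; the gallon conversion is the standard US definition.

```python
# Back-of-envelope check of the article's water figures.
# Assumptions: 0.32 ml per prompt (Altman's estimate),
# 2.5 billion prompts/day, 1 US gallon = 3,785.41 ml.
ML_PER_GALLON = 3785.41

prompts_per_day = 2.5e9
ml_per_prompt = 0.32

daily_ml = prompts_per_day * ml_per_prompt          # 800 million ml/day
daily_gallons = daily_ml / ML_PER_GALLON            # ~211,000 gallons/day
yearly_gallons = daily_gallons * 365                # ~77 million gallons/year

print(f"Daily:  {daily_gallons:,.0f} gallons")
print(f"Yearly: {yearly_gallons:,.0f} gallons")
```

The result lands close to the article's roughly 212,500 gallons per day and 77 million per year; the small gap comes down to rounding in the conversion factor.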

As the technology's appetite and popularity grow, so do doubts over its future. LLM usage is becoming entrenched in higher education, accelerating in courtrooms, and becoming mandatory at some companies. Yahoo Japan expects AI usage to double productivity by 2028, but multiple reports indicate that the technology actually slows workers down.

Microsoft cut 9,000 jobs to offset an $80 billion AI investment. The technology also recently made Nvidia the world's first $4 trillion company. However, a top economist warns that it has created a larger repeat of the 2000 dot-com bubble. The S&P 500 may have become dangerously dependent on massive AI bets from a few companies, and EZPR's Ed Zitron recently outlined how the sector's growth mostly stems from sales of Nvidia data center GPUs. The chipmaker might have become a single point of failure for an industry that has yet to demonstrate significant profitability.


 
If I understand this correctly... AI uses vast amounts of not only power but also water. Mining and processing the rare earth minerals that modern technology needs also requires it, fracking injects chemicals into the strata that can pollute groundwater, and people are burying toxic waste the same way, while we also need water for farming and, let's not forget, drinking. All so we can get a machine to do our thinking and research for us.
 

The same amount of resources would be used for general data centers, and the same number of GPU/ASIC/CPU/FPGA chips... yes, AI has greater energy requirements, but the same amount of stuff mined from the earth would be used elsewhere in the PC industry regardless.
 
If AI has such a high cost, why in the last few months do search engines give away an AI answer for "free" with *every single query*, unasked for? Maybe a few hundred million queries get put into ChatGPT or Copilot directly, fine... but the way it is now, the billions of Google, Bing, and even DuckDuckGo searches all hit an LLM and consume these resources, and nobody asked them to.

So if AI queries are so costly, to the point we were paying subscriptions to get full access to them a couple of years ago... then why are they generating an AI response to every single search on a regular search engine? Seems like billions of wasted AI queries.
 