WTF?! Arm's CEO has sounded a warning bell about the energy required to advance AI algorithms. He cautions that in a few years, "AI data centers" could demand so much electricity that they jeopardize the US power grid.

Arm CEO Rene Haas has highlighted the unsustainable energy demand of AI technology, warning of potentially severe consequences if significant breakthroughs are not achieved soon. Speaking before the announcement of a $110 million funding program for AI research at universities in the US and Japan, Haas emphasized the urgent need for effective research to prevent stagnation in AI development.

According to Haas, US-based AI companies currently consume around four percent of the country's overall power. By 2030, however, chatbots and other cloud-based generative AI services running in AI data centers could consume as much as 20 to 25 percent of the US power supply. Haas particularly emphasized the "insatiable" energy demand of large language models (LLMs) like ChatGPT.
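For a rough sense of scale, the sketch below converts those shares into absolute annual figures. The ~4,000 TWh baseline for total US electricity generation is our assumption for illustration, not a number from Haas or this article.

```python
# Rough conversion of the quoted grid shares into annual energy,
# assuming US electricity generation of ~4,000 TWh/year (assumed
# baseline, not a figure quoted in the article).
US_GENERATION_TWH = 4_000

current_share = 0.04            # ~4% consumed by AI companies today, per Haas
projected_share = (0.20, 0.25)  # 20-25% projected for AI data centers by 2030

print(f"Today: ~{US_GENERATION_TWH * current_share:.0f} TWh/year")
print(f"2030:  ~{US_GENERATION_TWH * projected_share[0]:.0f}-"
      f"{US_GENERATION_TWH * projected_share[1]:.0f} TWh/year")
# -> ~160 TWh today vs. ~800-1,000 TWh by 2030, a five- to six-fold increase
```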

These dire predictions about AI energy consumption are becoming increasingly common, with the International Energy Agency (IEA) projecting that power consumption by AI data centers this year will be 10 times higher than in 2022. Although they are consumer-oriented web services, chatbots have significantly higher power requirements than simple Google search queries.

The IEA estimates that a single ChatGPT request consumes almost 10 times as much energy as a Google search. If Google were to adopt LLMs for its Search service, the company would require an additional 10 terawatt-hours (TWh) of electricity per year. A recent report in The New Yorker stated that ChatGPT consumes more than half a million kilowatt-hours of electricity per day, whereas the average US household uses just 29 kilowatt-hours in the same timeframe.
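The arithmetic behind those comparisons is straightforward; the sketch below reproduces it from the numbers quoted above (the variable names are illustrative, not from any cited source).

```python
# Back-of-the-envelope check of the figures quoted above.
chatgpt_kwh_per_day = 500_000   # ChatGPT's reported daily consumption (kWh)
household_kwh_per_day = 29      # average US household's daily consumption (kWh)

# How many average US households ChatGPT's daily draw is equivalent to.
households_equivalent = chatgpt_kwh_per_day / household_kwh_per_day
print(f"ChatGPT's daily draw ~= {households_equivalent:,.0f} US households")
# -> roughly 17,000 households

# Google's hypothetical extra 10 TWh/year, expressed per day for comparison.
extra_twh_per_year = 10
extra_kwh_per_day = extra_twh_per_year * 1e9 / 365  # 1 TWh = 1e9 kWh
print(f"Extra Google draw ~= {extra_kwh_per_day:,.0f} kWh/day")
# -> roughly 27 million kWh/day, about 55x ChatGPT's reported figure
```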

The US government, along with authorities worldwide, will likely be compelled to intervene and impose strict limits on power consumption for both traditional and AI data centers, as noted in the IEA report. Haas argued that both hardware accelerators and AI algorithms must become far more efficient, or this era of relentless AI evolution may soon grind to a halt.

However, Big Tech and AI companies could leverage any efficiency gains to expand AI computing capabilities while holding power consumption steady. Another potential answer to AI's energy challenge is expanding power generation capacity, as major companies like Amazon are already trying to do.