AI could soon consume more electricity than Bitcoin mining and entire countries

Skye Jacobs

A hot potato: The global AI industry is quietly crossing an energy threshold that could reshape power grids and climate commitments. New findings reveal that the electricity required to run advanced AI systems may surpass Bitcoin mining's notorious energy appetite by late 2025, with implications that extend far beyond tech boardrooms.

The rapid expansion of generative AI has triggered a boom in data center construction and hardware production. As AI applications grow more complex and more widely adopted, the specialized hardware that powers them (accelerators from the likes of Nvidia and AMD) has proliferated at an unprecedented rate. This surge has driven a dramatic escalation in energy consumption, with AI expected to account for nearly half of all data center electricity usage by next year, up from about 20 percent today.

This transformation has been meticulously analyzed by Alex de Vries-Gao, a PhD candidate at Vrije Universiteit Amsterdam's Institute for Environmental Studies. His research, published in the journal Joule, draws on public device specifications, analyst forecasts, and corporate disclosures to estimate the production volume and energy consumption of AI hardware.

Because major tech firms rarely disclose the electricity consumption of their AI operations, de Vries-Gao used a triangulation method, examining the supply chain for advanced chips and the manufacturing capacity of key players such as TSMC.

The numbers tell a stark story. Each Nvidia H100 AI accelerator, a staple in modern data centers, consumes 700 watts continuously when running complex models. Multiply that by millions of units, and the cumulative energy draw becomes staggering.

De Vries-Gao estimates that hardware produced in 2023 – 2024 alone could ultimately demand between 5.3 and 9.4 gigawatts, enough to eclipse Ireland's entire national electricity consumption.
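
For a sense of scale, the sketch below (Python) runs the same kind of back-of-envelope arithmetic: per-unit wattage multiplied by fleet size, plus an assumed overhead for cooling and power delivery. The unit counts and the overhead factor are illustrative placeholders, not inputs from the Joule paper; only the 700-watt per-accelerator figure comes from the article.

    # Back-of-envelope sketch of fleet-level power arithmetic.
    # Unit counts and the PUE overhead below are illustrative assumptions,
    # not figures from de Vries-Gao's study.
    H100_WATTS = 700   # per-accelerator draw cited in the article
    PUE = 1.2          # assumed data center overhead (cooling, power delivery)

    def fleet_power_gw(units: int, watts_per_unit: float = H100_WATTS, pue: float = PUE) -> float:
        """Continuous power draw of an accelerator fleet, in gigawatts."""
        return units * watts_per_unit * pue / 1e9

    for units in (2_000_000, 5_000_000, 10_000_000):
        gw = fleet_power_gw(units)
        twh_per_year = gw * 8760 / 1000  # GW x hours per year -> TWh per year
        print(f"{units:>10,} units -> {gw:.1f} GW continuous (~{twh_per_year:,.0f} TWh/yr)")

Even with hypothetical counts, a few million accelerators running continuously lands in the single-digit-gigawatt range, the same order of magnitude as the 5.3 to 9.4 gigawatt estimate above.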

But the real surge lies ahead. TSMC's CoWoS packaging technology allows powerful processors and high-speed memory to be integrated into single units, the core of modern AI systems. De Vries-Gao found that TSMC more than doubled its CoWoS production capacity between 2023 and 2024, yet demand from AI chipmakers like Nvidia and AMD still outstripped supply.

TSMC plans to double CoWoS capacity again in 2025. If current trends continue, de Vries-Gao projects that total AI system power needs could reach 23 gigawatts by the end of the year – roughly equivalent to the UK's average national power consumption.

This would give AI a larger energy footprint than global Bitcoin mining. The International Energy Agency warns that this growth could single-handedly double the electricity consumption of data centers within two years.

While improvements in energy efficiency and increased reliance on renewable power have helped somewhat, these gains are being rapidly outpaced by the scale of new hardware and data center deployment. The industry's "bigger is better" mindset – where ever-larger models are pursued to boost performance – has created a feedback loop of escalating resource use. Even as individual data centers become more efficient, overall energy use continues to rise.

Behind the scenes, a manufacturing arms race complicates any efficiency gains. Each new generation of AI chips requires increasingly sophisticated packaging. TSMC's latest CoWoS-L technology, while essential for next-gen processors, struggles with low production yields.

Meanwhile, companies like Google report "power capacity crises" as they scramble to build data centers fast enough. Some projects are now repurposing fossil fuel infrastructure, with one securing 4.5 gigawatts of natural gas capacity specifically for AI workloads.

The environmental impact of AI depends heavily on where these power-hungry systems operate. In regions where electricity is primarily generated from fossil fuels, the associated carbon emissions can be significantly higher than in areas powered by renewables. A server farm in coal-reliant West Virginia, for example, generates nearly twice the carbon emissions of one in renewable-rich California.
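
As a rough illustration of that location effect, the governing relation is simply emissions = energy consumed × grid carbon intensity. The sketch below (Python) applies it with hypothetical intensity figures chosen only to show the mechanics; they are not measured values for West Virginia or California.

    # Minimal sketch: how grid location changes a data center's emissions.
    # The carbon-intensity figures are hypothetical placeholders, not measured grid data.
    GRID_INTENSITY_KG_PER_KWH = {
        "coal_heavy_grid": 0.80,       # assumed, illustrative
        "renewable_heavy_grid": 0.40,  # assumed, illustrative
    }

    def annual_emissions_tonnes(avg_power_mw: float, grid: str) -> float:
        """CO2 in tonnes/year = average power x hours per year x grid carbon intensity."""
        energy_kwh = avg_power_mw * 1_000 * 8760  # MW -> kWh over one year
        return energy_kwh * GRID_INTENSITY_KG_PER_KWH[grid] / 1_000

    for grid in GRID_INTENSITY_KG_PER_KWH:
        print(f"100 MW farm on {grid}: {annual_emissions_tonnes(100, grid):,.0f} t CO2/yr")

With these illustrative numbers, the coal-heavy grid produces twice the emissions of the renewable-heavy one for an identical workload, which is the shape of the comparison the article draws.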

Yet, tech giants rarely disclose where or how their AI operates – a transparency gap that threatens to undermine climate targets. This opacity makes it challenging for policymakers, researchers, and the public to fully assess the environmental implications of the AI boom.

I would be more interested in this topic if it attempted to net out the power requirements to do the same work without the AI/data center. I feel some data centers must still be far ahead just from hosting video conferences in lieu of business travel (I was one of those people who, pre-COVID, would regularly fly across the country for routine check-ins with clients). I wonder what the energy usage of each one of those trips was.

If he wants to see a really major change, he could compare now to the energy usage in these countries before the industrial age, when not just economic output but also population size was much smaller.

How far backward would he like to go?
 
"A server farm in coal-reliant West Virginia, for example, generates nearly twice the carbon emissions of one in renewable-rich California."

Is this the same California that cannot generate enough electricity to supply the state? 30% of the electricity comes from outside the state.

On the bright side, at the rate companies, jobs, and citizens are leaving California, it won't be long before they won't need any of that out-of-state energy, since demand will drop accordingly.
 
Is this the same California that cannot generate enough electricity to supply the state? 30% of the electricity comes from outside the state.
No, but does it surprise you to just buy someone else's surplus of... anything? California, Arizona and Utah arranged this long before you people turned it into a problem that didn't exist.

It's accurate. Sources at the bottom for reference.

And before you tell more tall tales, California's last rolling blackout was August 2020.
 
Well maybe countries should stop sitting on their arse and get the whole "nuclear power" thing going again. Then we wouldn't have the emissions problem.

And boy, if you think this is bad, just wait until we all have to drive EVs, use electric heat in our homes, etc.
I agree that nuclear power is the way to go in the future. Concerning EVs, it's not so bad in terms of overall power consumption: a total estimate for Norway, which has a 97% EV adoption rate, is around 3%. Electric heating is a much higher percentage if you're moving from gas. Data centers, on the other hand, are insane. Google is building a data center 45 minutes from where I live, and the estimate is that it'll consume 20% of the power in the entire region.
 
Well maybe countries should stop sitting on their arse and get the whole "nuclear power" thing going again. Then we wouldn't have the emissions problem.

And boy, if you think this is bad, just wait until we all have to drive EVs, use electric heat in our homes, etc.

Not necessarily a problem, actually, if people charge their cars at night, when generator output is traditionally lower because demand is lower.

Baseload plants might be able to meet that upswing in demand, at least in the short term.

"A server farm in coal-reliant West Virginia, for example, generates nearly twice the carbon emissions of one in renewable-rich California."

Is this the same California that cannot generate enough electricity to supply the state? 30% of the electricity comes from outside the state.

On the bright side, at the rate companies, jobs, and citizens are leaving California, it won't be long before they won't need any of that out-of-state energy, since demand will drop accordingly.

Not according to my sources. I've read plenty of peer-reviewed papers, and would you believe there have been periods when power plants had to be cut from the grid because there was too much power, and the excess had to be given away to neighboring states? Actually, California paid them to take it, to prevent grid instability.
 
What were humans bred and used for in The Matrix?
I can’t seem to recall.
/ sarcasm off
Thankfully, physics would say that the energy used to feed humans and keep us living would be greater than our power output. It's the only way I sleep at night. Also, The Animatrix was horrifying and legit gives me nightmares. The animated shorts that came as a special feature on "I Am Legend" did too, actually…
 
Any "climate commitments" should be immediately scrapped and forgotten, along with all the associated stupidities like measuring 'carbon footprint', carbon trading scams / taxes / etc..

It's idiotic to voluntarily restrict ourselves. Any available energy source should be used without any restrictions - coal, oil, gas, solar, hydro, nuclear - absolutely everything. Then AI will not be a problem.
 
The most ironic twist is that AI was pitched as the path to optimizing energy use, and now it might need its own dedicated power grid. Efficiency gains are great… unless you're scaling compute like it's a race to terraform Mars. The singularity isn’t robots taking over, it’s your GPU melting the planet.
 