Generative AI services are truly spectacular energy-wasting machines, says research

Alfonso Maruccia

A hot potato: Machine learning algorithms have taken the world by storm, and the planet will likely pay a price for the increasingly popular generative AI services sold through online subscriptions. For the first time, scientists have calculated how much energy these services need, and it's a lot.

Generative AI services are truly spectacular energy-wasting machines, and AI-based image "creation" is the worst offender when it comes to carbon emissions. A recently published study from AI startup Hugging Face and Carnegie Mellon University tries to understand the impact of AI systems on the planet by analyzing different tasks and generative models.

The paper examined the average carbon emissions produced by AI models over 1,000 queries, finding that generating text is a significantly less intensive activity than generating images. A chatbot answering 1,000 queries consumes about 16% of the energy needed for a full smartphone charge, while generating a single image with a "powerful" AI model can take as much power as a full recharge.

Study lead Alexandra Sasha Luccioni said that people think about AI as an "abstract technological entity" that lives on a "cloud," with no environmental impact whatsoever. The new analysis demonstrates that every time we query an AI model, the computing infrastructure sustaining that model has a substantial cost for the planet.

Luccioni's team calculated the carbon emissions associated with 10 popular AI tasks on the Hugging Face platform, including question answering, text generation, image classification, and more. The scientists developed a tool called Code Carbon to measure the energy used by those tasks; it calculates the power drawn by the computer running the AI model.

Using a powerful model like Stable Diffusion XL to generate 1,000 images, the study found, produces roughly as much carbon dioxide as driving an "average" gasoline-powered car for 4.1 miles. The least carbon-intensive text-generation model was responsible for as much CO2 as driving 0.0006 miles in a similar vehicle.
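For a rough sense of scale, the sketch below converts those driving equivalents into grams of CO2. The ~400 g of CO2 per mile for an average gasoline car is an assumed ballpark (roughly the figure commonly cited for a typical passenger vehicle), not a number taken from the study.

```python
# Back-of-the-envelope conversion of the study's driving equivalents into CO2.
# ASSUMPTION: ~400 g CO2 per mile for an average gasoline-powered car (rough ballpark).
GRAMS_CO2_PER_MILE = 400

def driving_equivalent_to_grams(miles: float) -> float:
    """Convert a driving-distance equivalent into grams of CO2."""
    return miles * GRAMS_CO2_PER_MILE

# 1,000 images with a powerful model (e.g. Stable Diffusion XL): ~4.1 miles.
print(f"1,000 images ~= {driving_equivalent_to_grams(4.1):,.0f} g CO2")          # ~1,640 g
# 1,000 text generations with the least carbon-intensive model: ~0.0006 miles.
print(f"1,000 text prompts ~= {driving_equivalent_to_grams(0.0006):.2f} g CO2")  # ~0.24 g
```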

Using large, complex generative models is a much more energy-intensive affair than employing smaller AI models trained on specific tasks, the study further explains. Because general-purpose models are trained to do many things at once, they can consume up to 30 times more energy than a simpler, fine-tuned, task-specific model.

The researchers also calculated that day-to-day emissions coming from AI services are significantly higher than the emissions associated with AI model training. Popular generative models like ChatGPT are used millions of times per day, and they would need just a couple of weeks to exceed the CO2 emissions associated with their training.

Vijay Gadepally, a research scientist at MIT Lincoln Laboratory, said that the companies profiting from AI models must be held responsible for their greenhouse gas emissions.


 
No, it won't kill the planet. Just like R-134a didn't kill the planet, and cows didn't kill the planet.

If these environmentalists actually cared, they'd be pushing to build new nuke plants to decommission all our coal and natural gas plants. But nuclear is icky, so.....

Makes one wonder if they're really part of a death cult and just want to kill us all off.
 
Who knew that banks of GPUs would consume so much power?
No, it won't kill the planet. Just like R-134a didn't kill the planet, and cows didn't kill the planet.

If these environmentalists actually cared, they'd be pushing to build new nuke plants to decommission all our coal and natural gas plants. But nuclear is icky, so.....
Did you get that answer from your 8-ball, or do you expect countries to build new nuclear plants just to power AI?
 
Who knew that banks of GPUs would consume so much power?

Did you get that answer from your 8-ball, or do you expect countries to build new nuclear plants just to power AI?
Overall electricity usage is going up as it is, and not primarily because of computing but because of EVs. If generating 1,000 images with the most energy-intensive AI engine uses the equivalent of 4.1 miles of gasoline (I really don't understand why the researchers didn't just report kWh), that equates to the electricity needed to drive an EV roughly 12 miles, far less than what the average EV driver covers daily. Plus, individuals are not prompting 1,000 AI-generated images a day, but they are increasingly buying EVs and then driving them every day for 10+ years.
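Rough math for that ~12 mile figure (a sketch: the 2.9 kWh per 1,000 images is the study's own number, discussed further down this thread, and the ~0.25 kWh per mile EV consumption is my own assumption):

```python
# Rough sketch: how far an EV could drive on the electricity the study says 1,000 SDXL images consume.
# ASSUMPTIONS: ~2.9 kWh per 1,000 images (study figure); ~0.25 kWh per mile for a typical EV.
kwh_per_1000_images = 2.9
ev_kwh_per_mile = 0.25

ev_miles = kwh_per_1000_images / ev_kwh_per_mile
print(f"{kwh_per_1000_images} kWh ~= {ev_miles:.0f} miles of EV driving")  # ~12 miles
```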

The assessment you responded to is the correct one. Computing, EVs, and any new uses of electricity don't need to be eliminated; they need to be met with new electricity generation. The study claims "the responsibility here lies with a company that is creating the models and is earning a profit off of them," but really this is wonderful news for anyone in the business of generating electricity: they have the assurance that newly built power generation will have future demand and can be profited from.
 
Let's do some math and see if this information is correct. The study claims that generating 1,000 images with SDXL uses 2.9 kWh of electricity. That figure is buried in the study (as opposed to the headline 4.1 mi of gas, lol), so there's the number.

For comparison, a recent blog post by Stability AI shows that the hardware used for image generation significantly impacts energy usage: https://stability.ai/news/stability-ai-sdxl-gets-boost-from-nvidia-tensor-rt
Based on that, it takes roughly 1,500 seconds to generate 1,000 images using an optimized workload on a single H100 accelerator. An H100 draws 700 W (source), and adding 300 W of system overhead on top of that raises the power usage to 1 kW. Applying that runtime to that power draw gives a total of 0.41 kWh of electricity (1/0.68 img/s * 1000 img / 3600 s/hr * 1 kW), or about 14% of the claimed energy usage.

I think they're using worst-case assumptions. They don't specify the image resolution or the number of steps used in the image generation (the iterations run to refine the image). Stability AI lists its benchmark as using 1024x1024 images at 30 steps. The study notes it uses the unoptimized version of the model (stable-diffusion-xl-base-1.0 instead of stable-diffusion-xl-1.0-tensorrt). It also notes it's using an A100 80GB SXM, which is older and less efficient (drawing 400 W and requiring 3,700 seconds to generate 1,000 images with SDXL).
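Here's that arithmetic in one place (a sketch using only the figures quoted above; the extra 300 W of system overhead is the same assumption as before, and none of these numbers are independently measured):

```python
# Sketch of the energy arithmetic above, using the figures quoted in this post.
# ASSUMPTION: +300 W of system overhead on top of the accelerator's rated draw.
SYSTEM_OVERHEAD_W = 300
STUDY_KWH_PER_1000 = 2.9  # figure reported in the study

def kwh_per_1000_images(seconds_per_1000: float, accelerator_watts: float) -> float:
    """Energy in kWh to generate 1,000 images at a given runtime and power draw."""
    total_watts = accelerator_watts + SYSTEM_OVERHEAD_W
    return total_watts * seconds_per_1000 / 3_600_000  # watt-seconds -> kWh

# Optimized SDXL (TensorRT) on one H100: ~0.68 img/s (~1,470 s), 700 W accelerator.
h100 = kwh_per_1000_images(1000 / 0.68, 700)
# Unoptimized SDXL base on an A100 80GB SXM: ~3,700 s, 400 W accelerator.
a100 = kwh_per_1000_images(3700, 400)

print(f"H100, optimized:   {h100:.2f} kWh ({h100 / STUDY_KWH_PER_1000:.0%} of the study's 2.9 kWh)")
print(f"A100, unoptimized: {a100:.2f} kWh ({a100 / STUDY_KWH_PER_1000:.0%} of the study's 2.9 kWh)")
```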

Furthermore, Stability AI has recently released a "distilled" model that reduces the required step count from 50 to 1 and now takes a fraction of a second to generate an image: https://stability.ai/news/stability-ai-sdxl-turbo

These are further power savings that are not being accounted for in the paper. Essentially, the worst-case scenario in the study is non-news at this point.
 
Maybe focus on celebs that have three mansions and use planes/helicopters to travel even short distances. For global warming meetings around the world, why not just use Zoom instead of having hundreds of elites travel to one location, like the meeting a few days ago?
 
Yet, AI apps can sell me dozens of images for $1 and still make money off the 70 cents left after Apple's cut, despite the "insane" power usage for each image.

The math doesn't check out, carbon doomsdayers.
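A quick sanity check (a sketch: the 2.9 kWh per 1,000 images is the study figure quoted earlier in this thread, and the ~$0.15 per kWh electricity price is my own assumption):

```python
# Rough electricity cost per AI-generated image, taking the study's worst-case figure at face value.
# ASSUMPTIONS: 2.9 kWh per 1,000 images (study figure); ~$0.15 per kWh retail electricity.
kwh_per_image = 2.9 / 1000
price_per_kwh = 0.15

cost_per_image = kwh_per_image * price_per_kwh
print(f"~${cost_per_image:.4f} per image")            # ~$0.0004
print(f"~${cost_per_image * 24:.2f} for two dozen")   # ~$0.01
```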
 
Who knew that banks of GPUs would consume so much power?

Did you get that answer from your 8-ball, or do you expect countries to build new nuclear plants just to power AI?
It's not just AI. We are pushing for electric everything. Electric heat. Electric stoves. Electric yard tools. And the biggest one: electric cars. All this needs power from....somewhere. No matter how many protests are held, solar doesn't make power at night, and wind doesn't make power when there is no wind or too much wind. Oh, and both desertify the land around them, and wind turbines kill tons of birds.

Building a single nuke plant to take some coal plants offline will do far more for the environment than shutting down all AI-related computation. The people wringing their hands and clutching their pearls over "AI environmental costs" are the kind of backyard drama Karens who shouldn't have any say in....anything. And in any other decade, they would have been told to STFU. AI carbon output is a footnote on the global stage and a drop in the bucket compared to mandatory EV usage around the globe. People whining about it are doing so to feel important; there are entire oceans' worth of fish to fry before AI carbon output becomes an issue.
 
It's not just AI. We are pushing for electric everything. Electric heat. Electric stoves. Electric yard tools. And the biggest one: electric cars. All this needs power from....somewhere. No matter how many protests are held, solar doesn't make power at night, and wind doesn't make power when there is no wind or too much wind. Oh, and both desertify the land around them, and wind turbines kill tons of birds.

Building a single nuke plant to take some coal plants offline will do far more for the environment than shutting down all AI-related computation. The people wringing their hands and clutching their pearls over "AI environmental costs" are the kind of backyard drama Karens who shouldn't have any say in....anything. And in any other decade, they would have been told to STFU. AI carbon output is a footnote on the global stage and a drop in the bucket compared to mandatory EV usage around the globe. People whining about it are doing so to feel important; there are entire oceans' worth of fish to fry before AI carbon output becomes an issue.
And don't forget that researchers who can find even flimsy evidence of the potential for carbon anything are rewarded with funding and press coverage.

These incentives drive scientists more than most people realize.
 
And don't forget that researchers who can find even flimsy evidence of the potential for carbon anything are rewarded with funding and press coverage.

These incentives drive scientists more than most people realize.
I'm sure that, unlike the medical industry or any other sector, the climate industry is TOTALLY immune to corruption and doesn't make everything alarmist for more money /s.
 
Proof-of-work crypto is a pure waste of power. AI at least does something useful, and it will actually become more efficient as AI processors advance.
 
Crypto was a clear example of how power-intensive this can be. Of course corporations have to hype up AI as the in thing and ignore the fact that it will be equally or more power/resource-intensive. Just think for a while about the number of GPUs Nvidia has sold and the rated power draw of each of them to get a rough sense of the power required. And yet people are talking about sustainability. I don't see how the two are compatible.
 
Crypto was a clear example of how power-intensive this can be. Of course corporations have to hype up AI as the in thing and ignore the fact that it will be equally or more power/resource-intensive. Just think for a while about the number of GPUs Nvidia has sold and the rated power draw of each of them to get a rough sense of the power required. And yet people are talking about sustainability. I don't see how the two are compatible.
Crypto is way worse. At least with AI, you aren't wasting power on all the miners that didn't get awarded the bitcoin, nor are you duplicating expensive work every second for a "consensus mechanism".
 
Just because crypto is even worse doesn't make it a good defence of this. Crypto energy usage will be a drop in the ocean compared to the energy big companies intend to blow on AI. And 99.9% of that 'AI' will go on people drawing purple space cats or making deep-fake videos of brainless celebrities.
 
No, it won't kill the planet. Just like R-134a didn't kill the planet, and cows didn't kill the planet.

If these environmentalists actually cared, they'd be pushing to build new nuke plants to decommission all our coal and natural gas plants. But nuclear is icky, so.....
When China launches nukes, you will be glad we at least do not have nuclear power plants to make the contamination that much worse.
 
When China launches nukes, you will be glad we at least do not have nuclear power plants to make the contamination that much worse.
If you mean "we" as in the US, you might be surprised to know there are 54 nuclear power plants in the US.
 
Just because crypto is even worse doesn't make it a good defence of this. Crypto energy usage will be a drop in the ocean compared to the energy big companies intend to blow on AI. And 99.9% of that 'AI' will go on people drawing purple space cats or making deep-fake videos of brainless celebrities.
The difference between crypto and AI is that PoW crypto is designed to absorb as much computing power as you throw at it. The more power-efficient mining hardware becomes, the harder the crypto problem becomes, to balance out the extra computing power.

AI, on the other hand, is an actual task. Its heavy power usage comes from not being optimised. There is every incentive for companies to have this task optimised, which can be done on the software AI side (like Stability AI's SDXL Turbo), or on the hardware side via drivers or special AI processors.

Sure, AI takes a lot of power and will continue to take a lot of power, but it's still in the research stages, and will become orders of magnitude more efficient. It will still take power, like anything widely used. BTW, I wonder how gaming compares in power usage.
 