Researchers found some LLMs create four times the CO2 emissions of other models with comparable accuracy. Their findings allow users to make informed choices.
Is it just me, or is that a stupid way to measure computing power consumption? The CPUs doing the computations don’t themselves produce any pollutants (unless you count what’s created during manufacturing and logistics, which I doubt). It’s the (without question stupidly large) energy consumption which might, but the big players are at least greenwashing their actions by using more and more renewable energy.
Why not create a comparison like “generating 1000 words of your fanfiction consumes as much energy as you do all day” or something that’s easier to compare?
Because it’s still bullshit. The bulk of the utilization of these LLMs isn’t going to your 1000-word Princess Leia/Deanna Troi lesbian romance. It’s going to some call center in the Philippines blowing up your cell with automated voice-to-text phone calls, and a bargain-basement Netflix animation studio experimenting with AI-generated children’s cartoons.
Once again, we have a giant Business Factory spewing out enormous plumes of waste to produce things nobody asked for. Then we’re getting an Op-Ed from some know-nothing hack on the greenwashing beat to tell their readers “Um, aktuly, these skyrocketing emissions are because you asked Alexa to add kidney beans to your shopping list.” And I will put even-money odds on this Op-Ed, itself, being AI generated.
Obviously. But I have no context for how much CO2 my actions create in the first place. I assume driving a car generates the majority of it, or maybe heating the house, but I still don’t have any clue how many kilograms that might be. What I do know is how many kilowatt-hours of electricity my house consumes, and at least roughly how much our appliances use, so if you want to try and blame me for consuming precious resources by generating text or watching a video, at least give me a measurement I can easily comprehend.
FWIW, a short query to a typically sized LLM takes about 1 Wh of energy. There’s a lot of variance depending on how big the model is and how long the inputs and outputs are, but that’s the correct order of magnitude. 1 Wh is the amount of energy a 1 kW electric kettle uses in 3.6 seconds, or a 2 kW hairdryer in 1.8 seconds.
If you assume that energy was produced in a coal power plant (the worst for CO2 emissions), that works out to around 1 g of CO2, roughly what you get from burning a third of a gram of gasoline.
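For anyone who wants to check that back-of-the-envelope math, here it is as a small sketch. The per-query energy, appliance wattages, and coal intensity are all rough assumptions carried over from the comment above, not measured values.

```python
# Back-of-the-envelope figures for a single short LLM query.
# All numbers are rough assumptions, good to an order of magnitude at best.
QUERY_ENERGY_WH = 1.0                 # ~1 Wh per short query
KETTLE_POWER_W = 1_000                # 1 kW electric kettle
HAIRDRYER_POWER_W = 2_000             # 2 kW hairdryer
COAL_INTENSITY_G_PER_KWH = 1_000      # assumed ~1 kg CO2 per kWh for coal power

joules = QUERY_ENERGY_WH * 3600                      # 1 Wh = 3600 J
kettle_seconds = joules / KETTLE_POWER_W             # -> 3.6 s
hairdryer_seconds = joules / HAIRDRYER_POWER_W       # -> 1.8 s
co2_grams = QUERY_ENERGY_WH / 1000 * COAL_INTENSITY_G_PER_KWH  # -> ~1 g

print(f"{QUERY_ENERGY_WH} Wh ≈ {kettle_seconds:.1f} s of kettle time, "
      f"{hairdryer_seconds:.1f} s of hairdryer time, ~{co2_grams:.0f} g CO2 on coal power")
```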
Generally, heating and cooling are the main domestic energy consumers; next up is the car, and then other electrical consumption (from what I remember).
As long as you don’t take a transatlantic trip, you’re fine:
“For example, having DeepSeek R1 (70 billion parameters) answer 600,000 questions would create CO2 emissions equal to a round-trip flight from London to New York. Meanwhile, Qwen 2.5 (72 billion parameters) can answer more than three times as many questions (about 1.9 million) with similar accuracy rates while generating the same emissions.”
I don’t know, y’all, but I can say it takes me a long-ass time to ask that many questions.
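To put those study numbers on a per-question basis: assuming a round-trip London–New York flight is on the order of one tonne of CO2 per passenger (a rough figure; published estimates vary), the quote works out to roughly this.

```python
# Rough per-question CO2 implied by the quoted study figures.
FLIGHT_CO2_G = 1_000_000          # assumed ~1 tonne CO2 per passenger, round trip
DEEPSEEK_QUESTIONS = 600_000      # DeepSeek R1 (70B), from the quote
QWEN_QUESTIONS = 1_900_000        # Qwen 2.5 (72B), from the quote

print(f"DeepSeek R1: ~{FLIGHT_CO2_G / DEEPSEEK_QUESTIONS:.1f} g CO2 per question")
print(f"Qwen 2.5:    ~{FLIGHT_CO2_G / QWEN_QUESTIONS:.1f} g CO2 per question")
```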
Generally, heating and cooling are the main domestic energy consumers; next up is the car, and then other electrical consumption (from what I remember).
I suppose it depends on where you live. Our house consumes something over 20 000 kWh per year since our heating is also electric (the rest of the consumption is pretty negligible compared to heating), and we also have a fireplace which burns around 15 m³ of firewood, depending on how cold the winter happens to be. The electric grid here has a ton of renewables and nuclear, so the CO2 footprint should be on the smaller side compared to the global average.
Also, as Google and Microsoft (among others) shoehorn AI “answers” into everything, that adds up, but private use seems to be quite insignificant anyway.
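For scale, here is that household figure next to the earlier ~1 Wh-per-query estimate (both numbers are rough, and this ignores training and datacenter overhead).

```python
# How many ~1 Wh LLM queries would equal one year of that household's electricity?
HOUSEHOLD_KWH_PER_YEAR = 20_000   # figure from the comment above
QUERY_ENERGY_WH = 1.0             # assumed ~1 Wh per short query

queries = HOUSEHOLD_KWH_PER_YEAR * 1_000 / QUERY_ENERGY_WH
print(f"~{queries:,.0f} queries per year")   # ~20,000,000
```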
I’m no fan of AI “answers”, because if I search for something, I’d like to have access to the source, or at least know whether I’m depending on a random social media post for my answer. I’m also pretty sure that - if they are smart, and they (mostly) are - caching questions and answers will cut down on the total number of questions asked (a rough sketch of that idea below).
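A purely illustrative sketch of what that kind of question/answer caching could look like (not how any particular provider actually does it; the model-call function here is a made-up stand-in):

```python
from functools import lru_cache

def expensive_llm_call(question: str) -> str:
    """Stand-in for the actual (energy-hungry) model call."""
    return f"answer to: {question}"

def normalize(question: str) -> str:
    # Collapse case and whitespace so near-identical questions share a cache entry.
    return " ".join(question.lower().split())

@lru_cache(maxsize=100_000)
def cached_answer(normalized_question: str) -> str:
    # Only reached on a cache miss; repeated questions cost essentially nothing.
    return expensive_llm_call(normalized_question)

def answer(question: str) -> str:
    return cached_answer(normalize(question))

print(answer("What is the boiling point of water?"))
print(answer("what  is the boiling point of WATER?"))  # served from the cache
```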
And then there are things like Grok, which fucks with air quality because Elon couldn’t wait until the power grid could supply his datacenter (or open the datacenter somewhere with access to the required amount of power), and instead runs dozens of gas turbines for power (without permits, because when the penalty is a fine, it’s just the cost of doing business).
https://www.desmog.com/2025/06/13/xai-data-centre-emits-plumes-of-pollution-new-video-shows/
Because if there is something that can be done in a responsible way, you can count on Elon to do it in the most braindead way possible.
e: direct link to the video: https://www.youtube.com/watch?v=mSWgDOzfKRI
Recycling 2: Electric Boogaloo
Why not create a comparison like “generating 1000 words of your fanfiction consumes as much energy as you do all day” or something that’s easier to compare?
Considering that you can generate 1000 words in a single prompt to ChatGPT, the energy to do that would be about 0.3 Wh.
That’s about as much energy as a typical desktop would use in about 8 seconds while browsing the fediverse (assuming a desktop consuming energy at a rate of ~150 W).
Or, on the other end of the spectrum, if you’re browsing the fediverse on Voyager with a smartphone consuming energy at a rate of 2 W, that would be about 9 minutes of browsing (4.5 minutes if using a regular browser app in my case, since that bumped the energy usage up to ~4 W).
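Here are those browsing-time equivalences spelled out; all the wattages and the 0.3 Wh figure are the rough assumptions from the comment above.

```python
# How long each device can browse on the energy of one ~1000-word generation.
GENERATION_ENERGY_WH = 0.3   # assumed energy for ~1000 words in one prompt

for device, power_w in [("desktop (~150 W)", 150),
                        ("phone, Voyager app (~2 W)", 2),
                        ("phone, browser (~4 W)", 4)]:
    seconds = GENERATION_ENERGY_WH / power_w * 3600
    print(f"{device}: ~{seconds:.0f} s of browsing per generation")
```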
Because it really doesn’t. For most tasks, doing the work yourself would take more human energy than the LLM uses, just because we are much slower at it than an AI. I mean, humans run at around 80 W just by existing (basal metabolic rate).
If the AI is powered by renewables, it’s cleaner than humans. If it’s powered by fossil fuels, it’s likely much worse (though I haven’t run the calculations).
Now obviously, this presumes that the output of an AI is even valuable at all, which is often not the case.
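A crude version of that comparison, assuming a person takes about half an hour to draft 1000 words at an ~80 W basal metabolic rate, versus roughly 1 Wh for the model (the task time and per-query energy are assumptions, and this ignores everything else in either footprint):

```python
# Crude human-vs-LLM energy comparison for drafting ~1000 words.
HUMAN_POWER_W = 80          # basal metabolic rate, from the comment above
HUMAN_TASK_MINUTES = 30     # assumed drafting time for ~1000 words
LLM_QUERY_WH = 1.0          # assumed per-query energy (see earlier comment)

human_wh = HUMAN_POWER_W * HUMAN_TASK_MINUTES / 60   # -> 40 Wh
print(f"Human: ~{human_wh:.0f} Wh vs LLM: ~{LLM_QUERY_WH:.0f} Wh for the same text")
```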
I’m somewhat in agreement, I think. Is it really me, talking to ChatGPT about the Holographic Theory in quantum mechanics and why the Mac version of Brother’s P-Touch software is such trash, that’s destroying the environment? Or is it the soulless corporate CEOs laying off thousands of customer service reps to replace them with AI bots who are really consuming all the energy? Not to mention all the lives they have directly and more immediately destroyed with their decisions.