archive.is link to article from allaboutai.com at https://www.allaboutai.com/resources/ai-statistics/ai-environment/
Yeah, AI is shit and a massive waste of energy, but it’s NOTHING compared to the energy usage of the airline industry.
Friend, did you actually follow the link? Maybe just read the pictures?
I checked. The IEA says airlines generate about a gigaton of CO2 a year, and it’s been growing again since the COVID dip, which is perhaps where your infographic authors got their screwy figures, which are, like I suggested, off by an order of magnitude.
Cite your source and make the comparison using that same source?
https://lmgtfy2.com/query/?q=IEA
Like I said, the IEA. The International Energy Agency. I wonder if you’ve heard of them. You can throw scepticism around as much as you like, dude, but
(1) I did not lie and
(2) your website is unreliable. Give it up. Again: LLMs are crap, they spout falsehoods all the time, they use unreasonably large amounts of data, but the airline industry pollutes a LOT more.
I begin to wonder whether your website was itself written by an LLM.
Both those numbers are insignificant.
Picked at random, it also claims this:
Why does nighttime AI use burn dirtier energy?
- Fossil fuel dominance: Coal and gas supply up to 90% of overnight electricity.
- Solar drop-off: Solar disappears after sunset, while wind delivers only ~30% capacity at night.
- Peak carbon hours: Between 2–4 AM, grid intensity rises to 450–650 gCO₂/kWh, compared to 200–300 gCO₂/kWh in the afternoon.
This is complete bullshit in the UK, where energy is greenest in the small hours of the night when demand is low and the wind turbines are still turning. Least green and most expensive is late afternoon and evening, when energy usage spikes.
Let me reiterate. AI is crap. AI is a massive waste of energy, but your website’s calculations are off by an order of magnitude when it comes to comparing the airline industry pushing tons of metal fast and hard into and through the sky with AI pushing a bunch of electrons through a bunch of transistors. Seriously, way off.
That says national not global
Just because something has a pretty infographic doesn’t make it true.
Shame to see this clickbait blog misinfo here, but the anti-ai sloppers won’t let that stop them.
Wth?
It’s a shitty biased site with incorrect data, ironically what looks to have been written by an AI.
Have anything to back that up or are you just saying it’s FUD?
here you go, since the commenter didn’t reply: https://andymasley.substack.com/p/a-cheat-sheet-for-conversations-about
So I’m a little confused, genuinely; this is in good faith. The link you provided was saying the “chat bot” features are what you shouldn’t worry about. It has a disclaimer itself that image/video generation is NOT covered there.
To be sure, the wimpy little chat bot that hallucinates that people should add glue to their pizza to stop the cheese sliding off, or that people should eat rocks, isn’t the part burning the BIG watt-hours. There’s more to AI than unhinged crappy chat sessions.
This is correct, but not all data center usage is GenAI either. I agree that video generation is pretty energy intensive and not something that should be done on the regular, but image generation runs on my graphics card here pretty fast, so it’s not a big deal and is comparable to chatbot responses.
Watching an hour of Netflix uses about 0.8 kWh, which is a lot more, but no one tries to give people a bad conscience for binge-watching a series. That’s data center usage too, and a lot more than GenAI.
AI is also used a lot in science, and even if I know that scientists aren’t that popular in the US, we probably agree that this activity is, on balance, positive for humanity, so let’s keep it.
So where is the AI usage you claim is worse than global air travel? Let’s look at total numbers here:
Electricity generation accounts for 30% of the total CO2 emitted in the USA. Data centers used 4.4% of that electricity in 2023, and estimates say this will reach between 6.7 and 12% in 2028. Taking the worst case, data centers will be responsible for a bit more than 3% of all CO2 emissions in the USA; best case, about 1.7%.
That’s not nothing, but it’s still a lot less than flying, which was responsible for 9% of CO2 emissions in the US transport sector, amounting to about 4% of total US emissions in 2022. Please note we are talking US flight emissions here, NOT GLOBAL. Since by far most data centers worldwide reside in the US, your statement is bullshit.
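A rough sketch of the multiplication behind those percentages, taking the figures above at face value (the comment’s rounding lands a bit lower than the straight product, but the order of magnitude is the point):

```python
# Rough back-of-envelope using the percentages quoted above; none of the
# inputs are independently verified here, they are just the comment's figures.

electricity_share_of_us_co2 = 0.30        # electricity ~30% of total US CO2
dc_share_of_electricity_2023 = 0.044      # data centers: ~4.4% of US electricity (2023)
dc_share_2028_low, dc_share_2028_high = 0.067, 0.12   # projected range for 2028

dc_co2_2023 = electricity_share_of_us_co2 * dc_share_of_electricity_2023
dc_co2_2028_low = electricity_share_of_us_co2 * dc_share_2028_low
dc_co2_2028_high = electricity_share_of_us_co2 * dc_share_2028_high

flight_co2_share = 0.04                   # comment: US flights ~4% of total US emissions (2022)

print(f"Data centers 2023:            {dc_co2_2023:.1%} of US CO2")   # ~1.3%
print(f"Data centers 2028, best case: {dc_co2_2028_low:.1%}")         # ~2.0%
print(f"Data centers 2028, worst:     {dc_co2_2028_high:.1%}")        # ~3.6%
print(f"US flights (2022):            {flight_co2_share:.1%}")        # ~4.0%
```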
ETA: People, the real climate killers have not changed: cars, industry, flight. Trying to create a bad conscience over what is peanuts in comparison is not worth it AND WILL NEITHER MAKE PEOPLE JOIN YOU NOR RESCUE THE PLANET, because these lies are easily disproven and make the green movement look untrustworthy.
What is this masterpiece ? Pro-pornography subliminal propaganda ?
Which is why I threw up in my mouth a little when my boss said we all need to be more bullish on AI this morning.
Replace your boss with it.
Same. And they basically jizz their pants when they see a practical use for AI, but 9 out of 10 times there’s already a cheaper and more reliable solution they won’t even entertain.
There’s practical use for AI?
You should correct their spelling of “bullshit”
My boss is also a fuckwit
I’ve mentioned it before but my boss’s boss said only 86% of employees in his department use AI daily and it’s one of his annual goals to get that to 100%. He is obsessed.
They’re salivating at the chance to reduce head count and still make money. Employees are by far the largest cost for any company. They hate paying it out when it could be for them.
This is my main issue with it. I think it’s useful enough, but only if it uses about the same energy as you would use doing whatever without it. Most conversations I’ve had with someone trying to convince me it doesn’t use too much power end up being very much like the crypto ones, where it keeps being apples to oranges and the energy consumption still seems too much. I’m hoping hardware can be made to get the power use lower the way graphics cards did. I want to see querying an LLM use about the same energy as searching for the answer, or less.
As soon as you see water consumption being called an issue you know it’s not to be taken seriously. Water doesn’t just disappear.
Surely you understand clean FRESH water is a limited resource. Renewable sure, but LIMITED.
They don’t need fresh water, nor do they dirty it.
Are you suggesting they’re cooling data centers with seawater
Water has to be cleaned. It is renewable, not free. My city has many sources, but most places where they build data centers do not. It is a real problem.
Either way, the biggest problem, IMO, is the pressure on the electric grid. If the asshats building the centers would cooperate with others, there would be far fewer problems.
As for power usage, yeah, it’s a lot but still not an insane amount. Image and video generation use a LOT more than text.
Image gen isn’t so heavy either, but even I, who argue against this “AI is killing the environment” bullshit, agree on the video gen - there is a lot of power usage behind that one. It might still not be an issue if you’re using renewables, but if you use Elon’s illegally gas-turbine-powered AI, then it’s fucking bad.
It is renewable, not free.
“Renewable” also doesn’t mean shit if the resource is being consumed faster than it’s being renewed. Ask the people who used to live on the shores of the Aral Sea how “renewable” their water was.
Makes me wonder what they are doing to reach these figures.
Because I can run many models at home, and it wouldn’t require me to pour bottles of water on my PC, nor would it show up on my electricity bill.
Well, most of the carbon footprint for models is in training, which you probably don’t need to do at home.
That said, even with training they are not nearly our leading cause of pollution.
The article says that training o4 required an amount of energy equivalent to powering San Francisco for three days.
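For scale, here is a hedged conversion of that “San Francisco for three days” comparison into energy; the city-wide consumption figure is an assumption on my part (very roughly 5–6 TWh per year for San Francisco), not something taken from the article:

```python
# Hedged conversion of "powering San Francisco for 3 days" into energy.
# ASSUMPTION: San Francisco's city-wide electricity use is roughly 5-6 TWh/year.
sf_annual_twh = 5.5                         # assumed annual consumption, TWh
sf_daily_gwh = sf_annual_twh * 1000 / 365   # ~15 GWh per day
print(f"~{3 * sf_daily_gwh:.0f} GWh")       # three days: on the order of 45 GWh
```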
Most of these figures are guesses along a spectrum of “educated”, since many models, like ChatGPT, are effectively opaque to everyone and we have no idea what the current iteration’s architecture actually looks like. But MIT did do a very solid study not too long ago that looked at the energy cost of various queries across various architectures. Text queries for very large GPT models actually had a higher energy cost than image gen using a normal number of iterations on Stable Diffusion models, which is pretty crazy. Anyhow, you’re looking at per-query energy usage somewhere between 15 seconds of microwaving at full power and riding a bike a few blocks. When tallied over the immense number of queries being serviced, it does add up.
That all said, I think energy consumption is a silly thing to attack AI over. Modernize, modularize, and decentralize the grids and convert to non-GHG sources and it doesn’t matter–there are other concerns with AI that are far more pressing (like deskilling effects and inability to control mis- and disinformation).
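To make the “seconds of microwaving” framing concrete, a tiny converter; the 1,100 W microwave and the sample per-query figures are assumptions in the range discussed in this thread, not numbers pulled from the MIT study:

```python
# Convert a per-query energy figure into "seconds of full-power microwaving".
# ASSUMPTIONS: an 1,100 W microwave; the sample per-query values are
# illustrative points in the 0.3-4 Wh range discussed in this thread.

MICROWAVE_WATTS = 1100

def microwave_seconds(query_wh: float) -> float:
    """Seconds the microwave must run to use the same energy as one query."""
    return query_wh * 3600 / MICROWAVE_WATTS

for wh in (0.3, 2.0, 4.0):
    print(f"{wh:.1f} Wh/query ≈ {microwave_seconds(wh):.0f} s of microwaving")
# 0.3 Wh ≈ 1 s, 2 Wh ≈ 7 s, 4 Wh ≈ 13 s
```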
Basically every tech company is using it… It’s millions of people, not just us…
Billions. Practically every Google search runs through Gemini now, and Google handles more search queries per day than there are humans on Earth.
Ew, who still uses Google Search?
OP, this statement is bullshit. You can do about 5 million requests for ONE flight.
I’m gonna quote my old post:
I had a discussion regarding generated CO2 a while ago here, and with the numbers my discussion partner gave me, the calculation said that the yearly usage of ChatGPT is approx. 0.0017% of our CO2 reduction during the covid lockdowns - chatbots are not what is killing the climate. What IS killing the climate has not changed since the green movement started: cars, planes, construction (mainly concrete production) and meat.
The exact energy costs are not published, but 3 Wh/request for ChatGPT-4 is the upper limit from what we know (and that’s in line with the approximate power consumption of my graphics card when running an LLM). Since Google uses it for every search, they will probably have optimized for their use case, and some sources cite 0.3 Wh/request for chatbots - it depends on what model you use. The training is a one-time cost, and for ChatGPT-4, amortizing it raises the maximum cost per request to 4 Wh. That’s nothing. The combined worldwide energy usage of ChatGPT is equivalent to about 20k American households. This is for one of the most downloaded apps on iPhone and Android - setting this against the massive usage makes clear that saving here is not effective for anyone interested in reducing climate impact, or you have to start scolding everyone who runs their microwave 10 seconds too long.
Even compared to other online activities that use data centers, ChatGPT’s power usage is small change. If you use ChatGPT instead of watching Netflix you actually save energy!
Water is a similar story, although the siting of data centers in the US sucks. The used water doesn’t disappear, though - it’s mostly returned to rivers or evaporated. Water usage in the US is 58,000,000,000,000 gallons (220 trillion liters) per year. A ChatGPT request uses between 10 and 25 ml of water for cooling. A hamburger uses about 600 gallons of water. 2 trillion liters are lost due to aging infrastructure. If you want to reduce water usage, go vegan or fix water pipes.
Read up here!
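Since these figures get thrown around a lot in this thread, here is a back-of-envelope check. The per-request energy and water numbers are the ones quoted above; the daily request volume, household consumption, grid carbon intensity, and per-flight CO2 are my assumptions, labeled in the comments:

```python
# Back-of-envelope check of the figures quoted above.
# Quoted upthread: 0.3-4 Wh per request, 10-25 ml of cooling water per
# request, ~600 gallons of water per hamburger.
# ASSUMED: ~1 billion requests/day, ~10,700 kWh/yr per US household,
# ~400 gCO2/kWh grid intensity, ~1 tCO2 for a round-trip transatlantic
# economy seat.

requests_per_day = 1e9          # assumption
wh_per_request = 0.4            # near the low end of the 0.3-4 Wh range

annual_kwh = requests_per_day * 365 * wh_per_request / 1000
print(f"~{annual_kwh / 10_700:,.0f} US-household equivalents")      # ~14,000

gco2_per_request = wh_per_request / 1000 * 400                      # ~0.16 g
print(f"~{1_000_000 / gco2_per_request:,.0f} requests per flight")  # ~6 million

burger_ml = 600 * 3785                                              # 600 gal in ml
print(f"~{burger_ml / 15:,.0f} requests per hamburger (water)")     # ~150,000
```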
If you only include chat bots, your numbers look good. Sadly, reality isn’t limited to chat bots.
Could you explain further?
Image/video generation and analysis (them scraping the entire public internet) consume far, far more than someone asking an AI “grok is this true”.
Do you have a source for this claim? I see this report by Google and MIT Tech Review that says image/video generation does use a lot of energy compared to text generation.
Taking the data from those articles, we get this table:
| AI Activity | Source | Energy Use (per prompt) | Everyday Comparison |
| --- | --- | --- | --- |
| Median Gemini Text Prompt | Google Report | 0.24 Wh | Less energy than watching a 100W TV for 9 seconds. |
| High-Quality AI Image | MIT Article | ~1.22 Wh | Running a standard microwave for about 4 seconds. |
| Complex AI Text Query | MIT Article | ~1.86 Wh | Roughly equivalent to charging a pair of wireless earbuds for 2-3 minutes. |
| Single AI Video (5-sec) | MIT Article | ~944 Wh (0.94 kWh) | Nearly the same energy as running a full, energy-efficient dishwasher cycle. |
| “Daily AI Habit” | MIT Article | ~2,900 Wh (2.9 kWh) | A bit more than an average US refrigerator consumes in a full 24-hour period. |

Another way of looking at this: a “Daily AI Habit” on your table is about the same as driving a Tesla 10 miles, or a standard gas car about 3 miles.
Edit 4 AI videos, or detour and take the scenic route home from work… about the same impact.
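A quick sanity check on that driving comparison; the EV and gas-car efficiencies are assumptions (roughly 0.28 kWh/mile for an EV, ~30 mpg and ~33.7 kWh per gallon of gasoline), not numbers from either article:

```python
# Sanity-check "Daily AI Habit (2.9 kWh) ≈ 10 Tesla miles ≈ 3 gas-car miles".
# ASSUMPTIONS: EV at ~0.28 kWh/mile; gas car at ~30 mpg; gasoline holds
# ~33.7 kWh of energy per gallon.

daily_habit_kwh = 2.9

ev_kwh_per_mile = 0.28
print(f"EV:  ~{daily_habit_kwh / ev_kwh_per_mile:.0f} miles")    # ~10 miles

gas_kwh_per_mile = 33.7 / 30                                     # ~1.12 kWh/mile of fuel energy
print(f"Gas: ~{daily_habit_kwh / gas_kwh_per_mile:.1f} miles")   # ~2.6 miles, i.e. "about 3"
```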
I like that as well, thank you! Yeah, the “Daily AI Habit” in the MIT article was described as…
Let’s say you’re running a marathon as a charity runner and organizing a fundraiser to support your cause. You ask an AI model 15 questions about the best way to fundraise.
Then you make 10 attempts at an image for your flyer before you get one you are happy with, and three attempts at a five-second video to post on Instagram.
You’d use about 2.9 kilowatt-hours of electricity—enough to ride over 100 miles on an e-bike (or around 10 miles in the average electric vehicle) or run the microwave for over three and a half hours.
As a daily AI user, I almost never use image or video generation and it is basically all text (mostly in the form of code), so I think this daily habit likely wouldn’t fit most people who use it on a daily basis, but that was their metric.
The MIT article also mentions that we shouldn’t try to reverse-engineer energy usage numbers and that we should instead encourage companies to release data, because the reverse-engineered numbers are invariably going to be off. Google’s technical report affirms this: it shows that non-production estimates of AI energy usage overestimate, because they miss the economies of scale a production system is able to achieve.
Edit: more context: my daily AI usage, on the extremely, extremely high end, let’s say is 1,000 median text prompts from a production-level AI provider (code editor, chat window, document editing). That’s equivalent to watching TV for 36 minutes. The average daily consumption of TV in the US is around 3 hours per day.
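For what it’s worth, the 2.9 kWh “Daily AI Habit” does follow from the per-prompt figures in the table above; the breakdown also shows the video attempts are nearly all of it:

```python
# Recompute the MIT "Daily AI Habit" from the per-prompt figures in the table:
# 15 text questions + 10 image attempts + 3 five-second video attempts.

text_wh, image_wh, video_wh = 1.86, 1.22, 944

daily_habit_wh = 15 * text_wh + 10 * image_wh + 3 * video_wh
print(f"Daily habit: ~{daily_habit_wh / 1000:.1f} kWh")           # ~2.9 kWh
print(f"Video share: {3 * video_wh / daily_habit_wh:.0%}")        # ~99%

# For comparison, 1,000 median Gemini text prompts at 0.24 Wh each:
print(f"1,000 text prompts: {1000 * 0.24 / 1000:.2f} kWh")        # 0.24 kWh
```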
please elaborate?
I’m not sure what you’re referencing. Image-gen models are not much different, especially now that they’re moving to transformer/MoE architectures. Video-gen models are chunky indeed, but more rarely used, and they usually have much smaller parameter counts.
Basically anything else machine learning is an order of magnitude less energy, at least.
The combined worldwide energy usage of ChatGPT is equivalent to about 20k American households.
Or about 10 small countries. Not even being that hyperbolic: American households are fabulously, insanely wasteful of energy.
The rest of the world (barring places like Saudi Arabia, which are rarely held up as moral or socio-cultural examples the world should learn from) had the whole ‘What’s the point in trying to better the world when America and China do more damage than the rest of the world combined?’ debate decades ago, and we ended up deciding that we can’t control the worst offenders, and can only do what we can.
Literally any moral value or standard is subject to ‘but but but what’s the point if you can’t eradicate the problem entirely?’, and that’s why it’s such a weak fallacy. Minimising absolutely pointless destruction of non-renewable resources won’t single-handedly save the environment tomorrow, but we can do it anyway, and it will help. We can’t eradicate theft, but we can do our best to pay for things before taking them. We know that being polite in public isn’t the one thing holding our society back from utopian perfection, but we do it anyway, because it helps.
We can all pinky promise not to murder or violently assault anyone, and pay no attention to the weirdo protesting that ‘What’s the point in not assaulting people when actually, cars and illness and unhealthy lifestyles do more harm’, because that person is presumably just looking for an excuse to hit someone.
And yeah, long story short: using ‘American households’ as an example of how insignificant AI’s energy usage is is kinda like saying smoking is safe because it’s actually less harmful than spending 6 hours a day on a busy road in Delhi. If you don’t spend 6 hours a day near busy roads in Delhi, you won’t exactly think ‘oh that’s ok then’. And if you do, your lungs need all the help they can get and you’ve got all the more reason to be wary of smoking (I say this as a smoker btw).
Huge areas of Africa and the Middle East are becoming uninhabitable because of climate change. Those people all need food and water, and the western world does not have the resources or inclination to house and feed them all. It is almost unanimously described as the worst crisis humanity has ever faced, and the practical solution - stop wasting fossil fuels and non-renewable resources when there’s a viable alternative - is so insanely easy.
Billions of lives could be saved, if everyone on the planet agreed to be mindful of energy waste. Not ‘stop using energy’ or ‘everybody become vegan and live in houses made of recycled banana peel’, just quit wasting.
But there are entire countries who don’t seem to get the whole ‘acting together for the betterment of humanity’ thing, so that incredibly simple solution won’t work. And all we can do in the meantime is to lead by example, make ‘responsible consumption’ a lifestyle rather than an option, and hope against hope that enough Americans and Chinese people decide to reduce their dependence on 1000 daily images of shrimp Jesus or an endless output of bullshit papers written by AI to pretend that’s what science means, in time to maybe save some of the planet before wildfire season lasts 12 months a year.
Also: it’s not even like you’re gaining anything from constantly using AI or LLMs. Just fleeting dopamine hits while your brain cells wither. Of all the habits one could try to reduce, or be mindful of, to literally save lives and countries, anybody who honestly thinks generative AI is more important is very addicted.
Also also: it’s just so shit.
But there are entire countries who don’t seem to get the whole ‘acting together for the betterment of humanity’ thing,
I would describe it as ‘indoctrinated by Big Oil’, heh… It is awful.
Also: it’s not even like you’re gaining anything from constantly using AI or LLMs. Just fleeting dopamine hits while your brain cells wither. Of all the habits one could try to reduce, or be mindful of, to literally save lives and countries, anybody who honestly thinks generative AI is more important is very addicted. Also also: it’s just so shit.
The majority of text ingestion/token generation is consumed by other machines for stuff like coding assistants or corporate data processing, and this includes image ingestion. I dunno what fraction image/video generation is, but I suspect it’s not high, as there’s really no point outside of cheap spam.
You are not wrong, and corpo AI is shit for plenty of reasons (including being needlessly power hungry when it doesn’t have to be), but I’m not relenting on the point that this is a ‘small fish’ issue to pursue compared to the massive waste in so many other parts of the US.
Big Oil and such delight in such distractions because it draws attention away from their more profitable and harmful sectors they’d rather people forget about.
If you want to look at it another way, if you assume every single square inch of silicon from TSMC is Nvidia server accelerators/AMD EPYCs, every single one running AI at full tilt 24/7/365…
Added up, it’s not that much power, or water.
That’s unrealistic, of course, but that’s literally the physical cap of what humanity can produce at the moment.
I did some research and according to some AIs this is true. According to some other AIs this is false.
The statement strikes me as overblown extreme position staking.
I use AI in my work, not every day, not even every week, but once in a while I’ll run 20-30 queries in a multi-hour session. At the estimated 2Wh per query, that puts my long day of AI code work at 60Wh.
By comparison, driving an electric car consumes approximately 250Wh per mile. So… my evil day spent coding with AI has burned as much energy as a 1/4 mile of driving a relatively efficient car, something that happens every 15 seconds while cruising down the highway…
In other words, my conscience is clear about my personal AI energy usage, and my $20/month subscription fee would seem to amply pay for all the power consumed and then some.
Now, if you want to talk about the massive data mining operations taking place at global-multinational corporations, especially those trawling the internet to build population profiles for their own advantages and profit… that’s a very different scale than one person tapping away at a keyboard. Do they scale up to the same energy usage as the 12 million gallons of jet fuel burned hourly by the air travel (and cargo) industries? Probably not yet.
There are about 9.6 kWh of energy in a litre of jet fuel (roughly 36 kWh per US gallon), so just jet fuel consumption is burning over 400 gigawatts on average, 24-7-365.
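Spelling out the arithmetic in this comment (the per-query estimate and the hourly fuel burn are the figures above; the ~36 kWh per US gallon of jet fuel is a standard energy-density figure, not something from this thread):

```python
# A long AI-assisted coding day vs. driving, and the jet-fuel power baseline.

# 20-30 queries at ~2 Wh each
day_wh = 30 * 2
ev_wh_per_mile = 250
print(f"Coding day: {day_wh} Wh ≈ {day_wh / ev_wh_per_mile:.2f} EV miles")   # ~0.24 mi

# Jet fuel: ~12 million gallons burned per hour, ~36 kWh per gallon (standard figure)
gwh_per_hour = 12e6 * 36 / 1e6   # GWh burned per hour, numerically equal to GW
print(f"Jet fuel: ~{gwh_per_hour:.0f} GW continuous")                        # ~430 GW
```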
I hope you recycle as well!
I hope your recycling is net carbon neutral as well. Example: how much CO2 is released by a recycling program which sends big diesel trucks all over the city to collect recyclables including cardboard, sorts that cardboard at a facility, ships a small fraction of it to a pulp recycling facility, and makes recycled cardboard from the post-consumer captured pulp? Consider the alternative to be: torching the cardboard at the endpoint of use - direct conversion to CO2 without the additional steps.
Don’t forget: new cardboard made from pulpwood also contributes to (temporary) carbon capture through growing the pulpwood trees, which also provide groundwater recharge and wildlife habitat on the tree farms - instead of the pavement, concrete, steel, electricity and fuel consumption of the recycling process.
“Dear expensive thing: Are you wasteful?”
“Uh…yes? Wait! No…”
I’ve barely ever used it, for just that reason and the fact that the algorithms are getting worse by the day. But now my work is forcing us to use it. To increase productivity, you see…
I wonder how one gets banned from using these tools without just spraying non-stop pastes of expletives into the chat box.
According to this page, every prompt uses one glass of water. Is there any chance we run out of water at this point?
There have been reports of AI data centers further draining water reserves in areas where water is neither abundant nor sufficiently replenished, which has not only environmental but also social and human consequences in the area.
It’s so important to differentiate between commercial LLMs and AI as a general concept.
Generating bullshit that isn’t really that useful.
Remember when the Apple Newton “revolutionized” computing with handwriting recognition?
No, of course not, because the whole thing sucked and vanished outside of old Doonesbury cartoons. LOL
My peer used the newton for comp sci class notes. Daily. Exclusively.
Then she went on to mastermind the behaviour and tactics of Myth: The Fallen Lords.
It’s tenuous, but I say that’s causal.
What does it mean to consume water? Like it’s used to cool something and then put back in a river? Or it evaporates? It’s not like it can be used in some irrecoverable way right?
I kind of wondered the same thing in the past, but the other day I read an LA Times article that illustrated the extent of the problem of water loss (not particularly related to data centers although we know they contribute to it). The main problem with evaporating water seems to be that it was water that we could have used which ended up in the ocean instead.
https://www.latimes.com/environment/story/2025-09-03/global-drying-groundwater-depletion
I infer that evaporation is worse than flushing it down the drain, so to speak, because if it were flushed you would at least be able to treat and recover much of it using much less energy than recovering it from the ocean. So it sounds like evaporation is largely (but obviously not completely) a one-way street, especially in arid regions, since only a tiny portion of the evaporated water would come back there as rain.
“using” water tends to mean that it needs to be processed to be usable again. you “use” water by drinking it, or showering, or boiling pasta too.
They need to use very pure water, and it evaporates completely, so it must be continually replenished.
Need is a strong word. There are much more efficient ways to cool data centers. They’ve just chosen the most wasteful way because it’s the cheapest (for them).
To be clear: I’m saying that an ongoing supply of pure water is a requirement of the cooling method they chose, not that they were required to choose that method. The poster I was replying to asked how water could actually be consumed and not just reused.
I think the point is that it evaporates and may return as rain, which is overwhelmingly acid rain or filled with microplastics or otherwise just gets dirty and needs to be cleaned or purified again.
If they take the water and don’t return it to the source, there will be less water available in the water body, which can lead to scarcity. If they take it and return it, but at a higher temperature or along with pollutants, it can impact the life in the water body. If they treat the water before returning it, so it is as close as possible to its original properties, there will be little impact, but that means using more energy and resources for the treatment.
Well, what you said is not true, but since you are so interested in this, why limit it to AI? Just quit using computers altogether.
What is not true?
They said that AI is polluting worse than global air travel. They are mixing up pollution and energy used. If we’re talking pollution, global air travel creates about 80 million tons of CO2 a month. All AI in use is about 15 million tons a month. Global air travel is far more polluting.
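Taking those two monthly figures at face value (they’re the commenter’s numbers, not independently checked here), the annualized comparison works out as:

```python
# Annualize and compare the two monthly CO2 figures quoted above.
air_travel_mt_month = 80    # quoted: global air travel, Mt CO2 per month
ai_mt_month = 15            # quoted: all AI in use, Mt CO2 per month

print(f"Air travel: ~{air_travel_mt_month * 12} Mt CO2/yr")           # ~960 Mt, roughly the gigaton cited upthread
print(f"AI:         ~{ai_mt_month * 12} Mt CO2/yr")                   # ~180 Mt
print(f"Ratio:      ~{air_travel_mt_month / ai_mt_month:.1f}x more")  # ~5.3x
```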
As an aside, and this is crazy: there is a reference in the article OP posted, to a paper, suggesting that humans are far worse than AI for CO2 creation, depending on the task. Which I found surprising.
So I read the published paper in the journal Nature and:
Our findings reveal that AI systems emit between 130 and 1500 times less CO2e per page of text generated compared to human writers, while AI illustration systems emit between 310 and 2900 times less CO2e per image than their human counterparts. Emissions analyses do not account for social impacts such as professional displacement, legality, and rebound effects. In addition, AI is not a substitute for all human tasks. Nevertheless, at present, the use of AI holds the potential to carry out several major activities at much lower emission levels than can humans.
Ok I honestly did not see that coming.
lmao time for a new wave of doomerism and anti-humanity
Very well thought out response. Will respond in kind.
HurrDeeeDurrr K
Thanks for clarifying. You made up statistics, your post is nonsense.
And you responding without any consideration that the constant reliance on computers in general, AI or not, uses a HUGE amount of energy indicates that you simply want to chase windmills and not have a conversation. Well played.
HurrDeeeDurrrr indeed. Next time let the grown ups talk.
Gaslighting schmuck. I “made up” nothing. Good day.
Arrg! I didn’t mean to delete what I wrote, I was just trying to update it.
You confused energy use with pollution.
And what I wrote before was:
I basically said that I was serious: people, if they cared, would stop using computers. But I am not going to stop, you are not going to stop, so data centers are going to grow no matter what we do, and computing use is going to increase energy consumption. We need to (as even the links in the article you posted say) improve efficiency, get better hardware, use lower-cost training models, use energy recovery, and stop using lossy evaporative cooling.
Edit: Arrrggg! I didn’t mean to delete my comment, I meant to add to it!
The point I was trying to make was that we are not going to quit using computers, AI or not. Data centers will continue to grow, AI will continue to be used because they are not going to quit either.
I said that before.
What I wanted to add: this is in agreement with several of the links in the article OP posted: more efficient algorithms, better hardware, regenerative water cooling instead of lossy evaporation.
The article suggests training on better models, heat recovery and re-use, even using AI to research better methods for processing and planning energy consumption for AI.
I was serious: we are unlikely to just throw out computers.