Recently, two articles were published and widely shared:
What these articles show is that the average personal usage of LLM-based chatbots like ChatGPT likely contributes a negligible share to the carbon footprint of a person living in a Western industrialized society.
Compared to other mundane online activities like video streaming, using chatbots consumes relatively little electrical energy and water. Hence, the second article claims that not using ChatGPT won’t measurably reduce your personal carbon footprint.
I do not intend to disprove these calculations. Since the companies operating AI chatbots do not report energy usage in meaningful ways, we do not have reliable data anyway. More importantly, these articles demonstrate once again how distracting and ultimately harmful the idea of a personal environmental footprint tied to individual moral responsibility is.
Considerate people in climate science and climate activism reason about states and economies, about decarbonization of whole sectors, about climate-neutral public infrastructure, about national policies and international treaties, about top-polluting industries, about holding corporations accountable to reach zero emissions, about installing gigawatts of photovoltaics, wind power and battery storage.
When researchers and activists talk about end consumers and private households, they discuss consumer prices, adjusting taxes on certain goods to incorporate external costs, subsidies for energy-efficient construction and renovation, as well as incentives and nudging. They talk about citizen initiatives for distributed clean energy, about local circular economies.
They do not shame consumers, say, for buying plastic when every available product is made of plastic, contains plastic or is wrapped in plastic. The reduction of global economic and ecological problems to individual lifestyle decisions is a well-known marketing ploy by top polluters to shift the responsibility from corporations to individuals.
While calculating the carbon footprint of certain activities might be valuable, it is not in the case of personal AI usage. You may endlessly dwell on the moral question of what to do as a consumer. You might even gain some insight into how the global economy works and where greenhouse gases are emitted. But in its societal function, the focus on individual carbon footprints is pure ideology.
The second article claims that “environmental objections to ChatGPT often dilute other serious criticisms of the technology.” Let us turn this thinking upside down: Environmental objections are necessary, valid and well-founded provided they address ICT and AI as a whole, not just the personal use of a particular chatbot. No matter how you look at it, the latter is part of the former. It cannot be singled out but also cannot be exempted.
The article acknowledges that “we should be focused on systematic change over individual lifestyles”, but utterly fails to do that. So let us examine the system that needs to be changed.
Computing in general, and AI in particular, consumes large amounts of energy, water and other natural resources. One way to quantify this is the electricity consumption of data centers, which amounted to around 415 terawatt hours (TWh) in 2024, about 1.5% of global electricity consumption (International Energy Agency [IEA], Apr 2025).
That demand is growing rapidly (IEA, Feb 2025, PDF, p. 36). Estimates differ on the future absolute amount, on the relative share, and on how the electricity will be generated (IEA, Berkeley Lab for the US Department of Energy, Dec 2024, Goldman Sachs Research, Feb 2025).
The IEA base scenario projects growth from 415 TWh to 945 TWh (× 2.28) by 2030, which would still be under 3% of total global electricity. The lowest scenario estimates around 700 TWh (× 1.69) by 2030, the highest 1,250 TWh (× 3.01). This surge is expected to be largely attributable to AI. The scenarios differ in the projected growth of AI and in how well the power grid and efficiency improvements can keep up with demand.
Whether data centers end up consuming 2% or 4.4% of global electricity in 2030, one thing is clear: for society, for the planet, this is significant.
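For readers who want to check these figures, here is a minimal back-of-the-envelope sketch in Python. The 415 TWh, the 1.5% share and the 2030 scenario values are the IEA numbers quoted above; the implied global total is simply derived from that share, and the quoted 2030 percentages additionally depend on how fast overall global demand grows, which is not modelled here.

```python
# Back-of-the-envelope check of the data center figures quoted above.
# 2024 value and 2030 scenarios: IEA; everything else is derived.

DC_2024_TWH = 415      # data center electricity use in 2024 (TWh)
SHARE_2024 = 0.015     # roughly 1.5% of global electricity in 2024

# Implied total global electricity consumption in 2024
global_2024_twh = DC_2024_TWH / SHARE_2024
print(f"Implied global electricity use 2024: ~{global_2024_twh:,.0f} TWh")

# IEA scenarios for data center demand in 2030 (TWh)
scenarios_2030 = {"lowest": 700, "base": 945, "highest": 1250}

for name, twh in scenarios_2030.items():
    factor = twh / DC_2024_TWH
    print(f"{name:>7} scenario: {twh:>5,} TWh  (x {factor:.2f} vs. 2024)")
```

Running this reproduces the multipliers in the text (× 1.69, × 2.28, × 3.01) and an implied global consumption of roughly 27,700 TWh in 2024; whether the 2030 data center share lands nearer 2% or 4.4% then depends on how total global demand develops.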
AI has gained economic and societal significance faster than any technology in the last 25 years. Partly because consumers find AI tools useful. Partly because businesses integrate AI into their workflows. Partly because Big Tech is forcing AI on its customers without giving them a way to opt out.
Billions of dollars of capital are flowing into the sector. Partly because Big Tech wants to grab power from workers, creatives and other industries through “disruption”. Partly because AI firms deceive and promise the moon. Their leaders subscribe to “utopian-apocalyptic” and fascist ideologies (“AGI”).
The proliferation of AI is already affecting real people:
- New data centers are being built in areas affected by water stress (Source Material, Apr 2025, Bloomberg, May 2025), for example in Arizona.
- New gas power plants are being built.
- Gas turbines are being set up next to AI data centers.
- New nuclear reactors are being built.
- Old, error-prone nuclear reactors are being switched on again.
Data centers are highly concentrated and localized. Their expansion in the wake of AI causes struggles in many places around the world, for real people and real communities, especially vulnerable and disadvantaged groups: struggles over water, pollution and health, land grabs, resource extraction and labor exploitation. There are many examples, from Mexico to Spain to Germany.
In the last 25 years, there has been no technology or economic activity for which politicians have been willing to sacrifice so much. AI companies are violating environmental regulations while governments are relaxing safety standards. This deregulation at the expense of local communities and the global climate is sugarcoated as “[streamlining] permitting and regulations”.
Local administrations and city councils attract AI companies to the detriment of their citizens. National governments are using authoritarian measures to allow their local tech firms to prevail in the global AI arms race.
The most glaring evidence of AI’s climate impact comes from the technology companies themselves. It is not just critics who point out the energy demand of AI. Big Tech reports that it does not meet its own climate targets – and attributes this to the skyrocketing electricity needs of AI. Instead of emitting less, these companies are emitting even more because of AI.
- “Google falling short of important climate target, cites electricity needs of AI” (AP). “In 2023, our total GHG emissions increased 13% year-over-year, primarily driven by increased data center energy consumption and supply chain emissions” (Google 2024 Environmental Report)
- “Meta emitted 14,067,104 metric tons of carbon dioxide equivalent in 2023, up slightly from 2022 and more than double what it reported in 2019, according to Meta’s 2024 sustainability report.” (Trellis)
- “The hype surrounding artificial intelligence is causing Microsoft’s emissions of climate-damaging gases to skyrocket.” (Heise)
These figures illustrate the catastrophic environmental impact of AI. These are the numbers we should be talking about. We need to hold Big Tech accountable and demand a quick turnaround.
These companies plan to operate climate-neutrally, reaching “net zero” by 2030. But the actual increase in emissions shows that we cannot take their promises at face value. They pay lip service to climate neutrality, but they care more about keeping pace in the AI arms race.
Many companies investing in AI argue that AI will improve energy efficiency in such a dramatic way that it will offset all extra emissions, allowing them to meet their net zero goals. “First we will emit more, but soon, when all savings take effect, we will emit even less than before!”
We know this “techno-solutionist” ploy already: The magic technology that will “offset” our greenhouse gas emissions is just around the corner. For some players in the fossil industry, it is called Carbon Capture and Storage (CCS). For the transport sector, the magic technology is called electric vehicles or even green hydrogen. For some politicians and tech nerds, it’s called small modular reactors or even fusion reactors. For Big Tech, the smoke screen is called AI. Soon, AI will make up for all the emissions of AI and the rest as well!
Yes, non-generative AI will likely help to optimize production processes and save energy in many industries (IEA, Apr 2025). Yes, energy demand does not grow linearly with computing power, thanks to efficiency advancements. But what we observe so far is a rebound effect: the performance and efficiency gains are eaten up by capitalist growth. Energy demand and carbon emissions are growing, not shrinking.
The recent AI race is a prime example of capitalism wasting resources. In the current global AI competition, every company trains and operates its models in private, closed source, on content scraped without permission, exploiting and attacking shared public infrastructure. The tremendous pace of advancement renders models trained months ago almost useless. The improvements do not bring more efficiency and less resource use for the industry as a whole; they merely fuel the race. So far, the AI bubble has not built anything that lasts – not even a return for investors.
It becomes evident that AI’s energy and water hunger is bad for the environment and thus bad for many people. How bad is it compared to fossil capitalism as a whole, which undermines the foundations of human life? What is its relative contribution to the climate catastrophe? Is it 1%, 2% or 3%? In any case, the amount is sizeable. A 1% difference matters, just as a 0.1°C global temperature rise matters. We must not let IT companies get away with this.
Apart from the quantity, the specific quality of AI’s environmental impact matters: due to the global competition between nation states, policymakers let AI firms break energy and environmental rules. These hard-won regulations keep people healthy, keep clean water available and keep critical infrastructure running.
“Individual use of AI is not bad for the environment”, says the article quoted above. If you treat this topic merely as a question of personal consumer responsibility, you twist it beyond recognition. If you recognize it as a societal problem, then yes: AI in its current form is bad for the environment. So let us ask: