Artificial intelligence (AI), in particular its trendiest flavour, generative AI, is having a growing impact on the world’s environment. The large language models (LLMs) that underpin generative AI require prodigious computing resources. An LLM must first be trained on data; once up and running, it then produces answers to user prompts. It turns out that, to be effective, an LLM needs a lot of parameters (a measure of the model’s capacity to learn) and a lot of training data. OpenAI’s GPT-4 model reportedly has over 1.7 trillion parameters, ten times as many as its predecessor GPT-3.5, and was trained on 45 TB of compressed plaintext, which shrank to around 570 GB after filtering. Training the model is estimated to have cost over $100 million, according to OpenAI’s CEO Sam Altman. The training phase alone is likely to have drawn around 25 megawatts of electricity continuously for three months – roughly the average power consumption of 20,000 US homes. Answering a query with an LLM also uses far more power than a traditional task-specific computer program – around 33 times more, by some estimates.
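For readers who want to check the arithmetic, here is a quick back-of-envelope calculation. The 10,500 kWh/year household figure is a typical US average assumed for illustration; it is not taken from the figures above.

```python
# Back-of-envelope check of the training-power comparison.
# Assumed (illustrative): a US household averages ~10,500 kWh/year.
TRAINING_POWER_MW = 25            # sustained draw during training
TRAINING_DAYS = 90                # roughly three months
HOUSEHOLD_KWH_PER_YEAR = 10_500

training_energy_mwh = TRAINING_POWER_MW * TRAINING_DAYS * 24
household_avg_draw_kw = HOUSEHOLD_KWH_PER_YEAR / (365 * 24)
homes_equivalent = TRAINING_POWER_MW * 1000 / household_avg_draw_kw

print(f"training energy: {training_energy_mwh:,.0f} MWh")     # 54,000 MWh
print(f"equivalent household draw: ~{homes_equivalent:,.0f} homes")  # ~20,857 homes
```

In other words, 25 MW sustained for three months comes to roughly 54 gigawatt-hours – the power draw of about 20,000 average homes while training is running.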
Then there are the ongoing costs of processing each user query. A single query costs from a fraction of a cent to a few US cents, which does not sound like much until you consider just how many queries are being made – over 2 billion a day as of August 2025. That means that, for ChatGPT alone, there is an ongoing processing cost of at least $1 million per day, and rising. And that, of course, is just one chatbot, albeit the most popular. There are also rivals such as Claude, Grok, Gemini, Llama and Perplexity, as well as image-generation models like Midjourney, Stable Diffusion, Leonardo and Adobe Firefly. Generating images takes considerably more computing power than generating text.
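That daily-cost estimate can be reproduced with a one-liner. The per-query cost range below is illustrative, not an OpenAI figure:

```python
# Rough daily inference cost for a chatbot at ChatGPT-like volume.
# Assumed (illustrative, not OpenAI's numbers): a query costs between
# a twentieth of a cent and three cents to serve.
QUERIES_PER_DAY = 2_000_000_000        # ~2 billion prompts/day (August 2025)
COST_LOW, COST_HIGH = 0.0005, 0.03     # US dollars per query

daily_low = QUERIES_PER_DAY * COST_LOW
daily_high = QUERIES_PER_DAY * COST_HIGH
print(f"daily cost: ${daily_low:,.0f} to ${daily_high:,.0f}")
# daily cost: $1,000,000 to $60,000,000
```

Even at the very bottom of that per-query range, the bill is a million dollars a day.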
This increasing demand for processing power has had a number of effects. Firstly, it has been a boon for chipmaker Nvidia, whose graphics processing units (GPUs) were originally designed to power video games but turned out to be perfectly suited to powering LLMs. Up until the end of 2022, Nvidia’s share price was around $15; by August 2025 it was trading at $180, making Nvidia the most valuable company in the world, with a market capitalisation of $4.4 trillion. All those chips have to run somewhere, so there has also been a boom in data centres. The global data centre industry is estimated to be worth $243 billion today and may well double in value by 2032 (some industry estimates are higher); McKinsey estimates that 70% of this growth is driven by AI. Data centres currently consume up to 1.5% of global electricity (around 300 terawatt-hours a year), and this is rising by the day – by comparison, the whole of Spain consumes 249 terawatt-hours annually. The squeeze is visible on the ground: the vacancy rate for data centres in Northern Virginia, a popular hub for such facilities, was just 1% in 2024.
This rise in demand for data centres in turn puts a strain on the energy industry, which needs to supply them with electricity. The AI boom may even revive the nuclear industry, according to Goldman Sachs: renewables can meet some of the new demand, but data centres need a consistent supply, and the wind and sun are less reliable than nuclear power. A further issue is the demand for water, which is used to cool the servers in data centres and which is a scarce resource in some countries. Even in relatively soggy Britain, there are concerns that the extra demand from data centres could cause water shortages. The water used for cooling needs to be clean, so that contaminants do not damage the machinery, which means data centres often draw on drinking-water supplies. Data centres in West Des Moines, Iowa used 6% of the district’s entire water supply in a single month while OpenAI was training GPT-4. The power boom fuelled by AI is also affecting carbon emissions, a major driver of climate change: Google’s greenhouse gas emissions in 2023 were up 48% compared to 2019.
There are some industry responses to this pressure. Some researchers have shown that smaller LLMs can perform very well if they are cleverly engineered. One example is the DeepSeek model from China, which held its own in benchmark tests against other LLMs despite being a fraction of the size of ChatGPT: it has 671 billion parameters in total, but only 37 billion are activated for any given task. Another approach is the hierarchical reasoning model unveiled by Singaporean company Sapient Intelligence, which uses small amounts of training data yet outperformed both OpenAI’s and Anthropic’s LLMs on the ARC-AGI benchmark, a sort of IQ test for AI models. It is also likely that data centres themselves will become more environmentally efficient over time, with more energy-efficient servers. And AI models may themselves be put to good use against climate change, optimising smart energy grids and helping to monitor and improve the efficiency of industrial systems.
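DeepSeek’s trick of activating only 37 billion of its 671 billion parameters is known as a mixture-of-experts design: a small router picks a handful of expert sub-networks to run for each token, leaving the rest idle. The toy sketch below illustrates the routing idea only; all the sizes are made up, not DeepSeek’s real ones.

```python
import math
import random

# Toy sketch of mixture-of-experts routing: only TOP_K of NUM_EXPERTS
# expert sub-networks run for each token, so most parameters stay idle.
# Sizes are made up for illustration.
NUM_EXPERTS = 8
TOP_K = 2

def softmax(scores):
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def route(scores):
    """Return the TOP_K highest-scoring experts with renormalised weights."""
    probs = softmax(scores)
    top = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:TOP_K]
    norm = sum(probs[i] for i in top)
    return {i: probs[i] / norm for i in top}

random.seed(0)
token_scores = [random.gauss(0, 1) for _ in range(NUM_EXPERTS)]
weights = route(token_scores)
print(f"experts used for this token: {sorted(weights)} of {NUM_EXPERTS}")
print(f"fraction of expert parameters active: {TOP_K / NUM_EXPERTS:.0%}")
```

In this sketch only a quarter of the experts fire per token; in DeepSeek’s case the active fraction is roughly 37/671, or about 5.5%, which is why its per-query compute is so much lower than its headline parameter count suggests.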
Nonetheless, as LLMs become more advanced, grow in size and attract ever more users, their demands on the environment will increase. Advances in smaller models may help, but these remain at the fringes of research compared to mainstream LLMs. The likely net effect of the rise of LLMs is a substantial increase in demand for electricity and clean water, and governments and corporations need to consider what that will mean for carbon emissions and other environmental impacts. Generative AI brings much promise across a range of industries, but that promise needs to be balanced against its effect on our environment.