
AI generation and CO2 emissions: the everyday tool and how it impacts our environment

  • Writer: Karik x10
  • Jul 25
  • 3 min read
OpenAI GPT logo

In recent years, Large Language Models (LLMs) like ChatGPT have grown into incredibly popular everyday tools, helping people with day-to-day work tasks, satisfying curiosities, or generating things like visualisations and artwork.

It's practically a part of our lives now, with direct integrations into search engines, apps, and even operating systems, like Windows Copilot.

These tools have paved the way for future work assistants, but they have brought several issues with them. LLMs require intensive training on data from both public sources and closed networks to function at the capacity they need, and to cope with handling hundreds of requests at once. Models like GPT-4 have billions of parameters, and as such they have a huge demand for power, putting a big strain on the electricity grid and causing a large amount of CO2 emissions.


We've even had our own fun with it, experimenting with tools like GPT-4o to see what limits they have on generation.


LLMs operate from large data centres that have to run 24/7 at extremely high capacity. As you can imagine, these are very demanding and very power-hungry, accounting for approximately 3.7% of global CO2 emissions, a figure amplified further during the training phase of LLM production.

The hardware behind them harms our environment too. AI models run heavily on GPUs (Graphics Processing Units), hardware whose production relies heavily on mining and processing rare earth metals, polluting the air and the surrounding environment.

Data Farm


Think before you prompt!

Every prompt you have generated, whether text, image, or now even video with tools like Veo 3, takes a toll on our environment. The servers burn energy, consume water, and leave a real-world carbon footprint. These convenient tools have a hidden environmental cost that gets ignored in the rush for quick image creation, research, and writing.


Per 2024 reports from the World Economic Forum, AI models can help us tackle global issues such as climate change: these tools can track the state of icebergs in Antarctica, help us identify pollution, and aid us in reducing the environmental impact of the agriculture industry. But that still comes at a cost, as every generated response from AI models burns more energy and pollutes our environment further.

A single image generation can use as much as half the energy it takes to charge a phone from dead to 100% battery. Training an LLM can output the CO2 emissions of five cars. And with demand and usage so high, it is projected that by 2027, AI models alone could be responsible for between 4.2 and 6.6 billion cubic metres of water withdrawal.


There is some hope: ways forward to improve LLMs and make them less impactful on the environment have been identified!

Model optimization is one of the biggest topics right now, and the most common question is "How can we make our AI model smaller and more efficient?" Approaches include training smaller "student" models, like GPT-3.5, to mimic larger models while operating at a fraction of the capacity, as well as using liquid or immersion cooling to reduce the electricity and water usage of servers.
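To make the "student mimics teacher" idea concrete, here is a toy sketch of knowledge distillation, the common technique behind training smaller student models. The logit values and temperature are made up for illustration; real distillation trains the student on the teacher's softened probability outputs across an entire dataset, not a single example.

```python
import math

def softmax(logits, temperature=1.0):
    """Convert raw model scores into probabilities, softened by temperature."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """Cross-entropy between the teacher's and student's softened outputs.

    Training minimises this, pushing the small student to imitate the
    large teacher's behaviour.
    """
    t = softmax(teacher_logits, temperature)
    s = softmax(student_logits, temperature)
    return -sum(ti * math.log(si) for ti, si in zip(t, s))

teacher = [4.0, 1.0, 0.2]   # large model's output scores (illustrative)
student = [3.5, 1.2, 0.3]   # smaller model's output scores

print(round(distillation_loss(teacher, student), 4))
```

The point of the higher temperature is that it exposes the teacher's full "opinion" about every option, not just its top answer, which is what makes a small student able to absorb a large model's behaviour cheaply.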

Other ideas include constructing data centres in cooler countries and environments, and reusing older models. These are all things the companies behind LLMs are discussing. But you can do something too.


As mentioned earlier, LLMs are a useful tool for everyday work, and there isn't really a reason to refrain from using them outright. But we can use them more responsibly. The first step is to batch requests: if you have multiple questions, ask them all in the same request. Default to smaller models, especially for simpler tasks, and definitely stop using AI tools to write responses to personal messages.
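The batching advice above can be sketched in a few lines: instead of sending one request per question, combine them into a single prompt. The `build_batched_prompt` helper and the numbered-list format are illustrative, not tied to any particular chatbot's API.

```python
def build_batched_prompt(questions):
    """Merge several questions into one numbered prompt,
    so a single request replaces many separate ones."""
    numbered = [f"{i}. {q}" for i, q in enumerate(questions, start=1)]
    return "Please answer each of the following:\n" + "\n".join(numbered)

questions = [
    "Summarise this week's meeting notes.",
    "Draft a subject line for the follow-up email.",
    "List three risks in the project plan.",
]

# One prompt, one round of server work, instead of three.
print(build_batched_prompt(questions))
```

Each request a server handles has a fixed overhead on top of the generation itself, so three questions in one prompt costs less energy than three separate prompts.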


In an age of technological advancement, LLMs have been one of the most impressive tools developed, but with usage at such high volume, the environmental impacts add up. We need alternative means to power the data centres and keep the servers running, whether that is mitigating power usage with solar energy, or building centres in areas that won't require them to burn through as much energy or water. Whatever it may be, this wonderful tool needs better backing, so that it becomes more sustainable for our environment. And for our usage.


Written by Karik Childs


