What is the true impact of your AI queries?

It seems like everyone and their mother is talking about AI these days, especially large language models (LLMs) like OpenAI’s ChatGPT or the newly launched DeepSeek. But while asking an AI chatbot might be the new “just Google it” — and even Google has augmented its search engine with AI — no one seems to know exactly how much energy it takes to produce those neatly worded results we’re now all too familiar with.

While companies like Microsoft, Meta and OpenAI elect not to share detailed figures on the cost of machine learning, we do know that training these models consumes more electricity than traditional data center activities. For example, training an LLM like GPT-3 has been estimated to use just under 1,300 megawatt hours (MWh) of electricity, roughly equivalent to the power consumed annually by 130 US homes. In comparison, streaming Netflix for one hour requires only 0.0008 MWh of electricity, meaning you’d need to watch 1,625,000 hours to consume the same amount of power it takes to train GPT-3.[1]
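A quick back-of-the-envelope check of those comparisons, using only the figures cited above plus one assumption: an average US home consuming roughly 10 MWh of electricity per year (a figure implied by, but not stated in, the article).

```python
# Sanity-checking the article's energy comparisons.
GPT3_TRAINING_MWH = 1_300       # estimated GPT-3 training consumption
US_HOME_ANNUAL_MWH = 10         # assumed average annual US household use
NETFLIX_MWH_PER_HOUR = 0.0008   # per the article

homes_powered_for_a_year = GPT3_TRAINING_MWH / US_HOME_ANNUAL_MWH
netflix_hours = GPT3_TRAINING_MWH / NETFLIX_MWH_PER_HOUR

print(f"{homes_powered_for_a_year:.0f} US homes for a year")   # 130
print(f"{netflix_hours:,.0f} hours of Netflix")                # 1,625,000
```

Both ratios reproduce the numbers quoted in the paragraph above, so the two comparisons are internally consistent with each other.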

But that’s just training. What about deployment and usage? According to Jesse Dodge, Senior Research Analyst at the Allen Institute for AI, every single question we ask an AI chatbot is routed to a data center, and “one query to ChatGPT uses approximately as much electricity as could light one lightbulb for about 20 minutes.”[2] That is far more energy than other online activities, such as posting on social media or storing photos in the cloud, require. Naturally, as more people use AI, data centers have multiplied as well. According to Bloomberg, the number of data centers around the world has nearly doubled within the last 10 years, and in total they consume as much electricity per year as the entirety of Italy.[2]

As someone who grew up hearing about the threat of climate change brought about by emissions, I find this deeply concerning. Not only have we yet to fully understand the capabilities and implications of AI, but we have also not paused to consider its very real environmental impacts. We’re rapidly embracing and integrating this fairly new technology into our daily lives without being fully aware or informed of the true cost, and it may be some time before we realise the full extent.

So the next time you’re tempted to ask ChatGPT something you could just as easily ask Google (or the person sitting next to you), I implore you to think twice about the true environmental impact of your query. I know I will; there’s a good chance we can get to the answer the old-fashioned way anyway, without putting our planet further at risk. I might even phone a friend.

Sources

[1] Vincent, James. “How much electricity does AI consume?” The Verge, 16 Feb. 2024,
https://www.theverge.com/24066646/ai-electricity-energy-watts-generative-consumption

[2] Kerr, Dana. “Artificial intelligence’s thirst for electricity,” NPR, 10 Jul. 2024,
https://www.npr.org/2024/07/10/nx-s1-5028558/artificial-intelligences-thirst-for-electricity

Louise Tam