
AI in ASIA

The Shocking Truth: How AI and ChatGPT Are Guzzling Our Energy


Intelligence Desk · 3 min read

AI Snapshot

The TL;DR: what matters, fast.

AI, exemplified by ChatGPT, consumes significant electricity, with ChatGPT alone using 500,000 kilowatt-hours daily.

Integrating generative AI into Google searches could lead to 29 billion kilowatt-hours consumed annually, exceeding the energy use of some countries.

The AI sector's total energy consumption is projected to reach 85 to 134 terawatt-hours annually by 2027, highlighting a growing concern for sustainability.

Who should pay attention: Policymakers | Technologists | Environmentalists

What changes next: Debate is likely to intensify regarding AI’s environmental impact.


The Dark Side of AI: A Looming Energy Crisis

Artificial intelligence (AI) has become the defining technology of the 21st century, and its use has become almost ubiquitous in recent years. Its widespread adoption, however, carries a hidden cost: an insatiable hunger for electricity.

ChatGPT: The Energy-Hungry Chatbot

ChatGPT, the world's most popular AI chatbot, consumes an estimated half a million kilowatt-hours every day to handle roughly 200 million user requests, according to The New Yorker. That is enough electricity to power an average US household for about 46.5 years, or 17,000 households for a single day. For more on how AI is changing our daily lives, see how ChatGPT's 'Buy It' Button Is Quietly Rewriting Online Shopping.
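The household comparison is easy to sanity-check. The sketch below assumes an average US household uses about 29 kWh per day (roughly in line with US EIA figures); that baseline is our assumption, not a number from the article:

```python
# Sanity-check the household comparisons, assuming an average US
# household uses about 29 kWh per day (~10,585 kWh per year).
CHATGPT_KWH_PER_DAY = 500_000
HOUSEHOLD_KWH_PER_DAY = 29
HOUSEHOLD_KWH_PER_YEAR = HOUSEHOLD_KWH_PER_DAY * 365

# How many households could run for one day on ChatGPT's daily draw?
households_for_one_day = CHATGPT_KWH_PER_DAY / HOUSEHOLD_KWH_PER_DAY

# How many years could one household run on that same daily draw?
years_for_one_household = CHATGPT_KWH_PER_DAY / HOUSEHOLD_KWH_PER_YEAR

print(f"{households_for_one_day:,.0f} households for a day")    # ~17,241
print(f"{years_for_one_household:.1f} years for one household")  # ~47.2
```

The results land close to the article's figures; the slight gap on the years estimate (47.2 vs 46.5) simply reflects a marginally higher household baseline in the original source.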

Google's AI Ambitions: A Recipe for Disaster?

The picture becomes more alarming when considering wider adoption of the technology. Alex de Vries, a data scientist at the Dutch central bank, published a study in the journal Joule suggesting that if Google integrated generative AI into every search, the feature could consume a mind-boggling 29 billion kilowatt-hours annually, more than the yearly electricity consumption of entire countries such as Kenya, Guatemala, and Croatia. It also has implications for how Google AI Overviews (with ads!) coming to APAC are designed.
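Dividing the annual projection by Google's search volume gives a feel for the implied per-query cost. The ~9 billion searches per day used below is a commonly cited estimate we are assuming for illustration; it does not come from the article:

```python
# Back out the implied energy per AI-assisted search from the
# 29 billion kWh/year projection, assuming ~9 billion searches/day.
ANNUAL_KWH = 29_000_000_000
SEARCHES_PER_DAY = 9_000_000_000

kwh_per_search = ANNUAL_KWH / (SEARCHES_PER_DAY * 365)
wh_per_search = kwh_per_search * 1000  # convert kWh to Wh

print(f"~{wh_per_search:.1f} Wh per search")  # ~8.8 Wh
```

Under these assumptions each AI-assisted search costs a few watt-hours, several times more than a conventional search is generally estimated to use; it is the sheer query volume that turns that into country-scale totals.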

The AI Sector: A Growing Energy Drain

Estimating the total energy consumption of the AI industry is difficult: large models have widely varying operational needs, and tech giants are secretive about their energy usage. Using data from chipmaker Nvidia, however, de Vries arrives at a projection: by 2027 the entire AI sector could be using a staggering 85 to 134 terawatt-hours annually, potentially around half a percent of global electricity consumption. This growth also connects to a broader issue covered in Running Out of Data: The Strange Problem Behind AI's Next Bottleneck. For a deeper dive into the environmental impact of AI, consider this report by the AI Now Institute on AI's environmental costs.

Addressing AI's Energy Consumption: The Path to Sustainability

As AI development continues, addressing its energy consumption will be crucial to ensuring a sustainable future. Tech companies must prioritise energy efficiency and invest in renewable energy sources to power their AI systems. Governments and regulatory bodies should also play a role in setting energy consumption standards and promoting transparency in the industry. This is particularly relevant in the regions covered in North Asia: Diverse Models of Structured Governance, where regulatory frameworks are rapidly evolving.

Comment and Share

What are your thoughts on the energy consumption of AI and its potential impact on the environment? Do you think tech companies are doing enough to address this issue? Share your opinions in the comments section below and don't forget to Subscribe to our newsletter for updates on AI and AGI developments.





Latest Comments (4)

Harry Wilson (@harryw) · 8 January 2026

The 29 billion kWh figure for Google integrating generative AI into every search is wild. I wonder how much of that is attributed to inference versus the initial training of those models.

Rachel Foo (@rachelf) · 31 August 2024

yeah we're looking at a new AI model for fraud detection and compliance is already asking about the server racks. "what's the carbon footprint Rachel?" they ask. meanwhile, our existing systems probably consume more than Kenya already lol.

Arjun Mehta (@arjunm) · 31 August 2024

Actually, Nvidia's Hopper H100s are insanely efficient per calculation. It's the scale and the massive training runs, not the individual chips, that drive this consumption.

Marcus Thompson (@marcust) · 20 July 2024

We just started trialing some AI tools for internal dev work and the pushback from our infra team on potential power draw was immediate. 29 billion kWh for Google search... that really makes you think about scale.
