
Post Reports Talking to ChatGPT drains energy. These other things are worse.
Oct 6, 2025

Michael J. Coren, a climate advice columnist known for his insights on responsible AI use, discusses the surprising energy and water costs of AI technologies. He explains how a single chatbot query consumes real resources, but reassures listeners that AI remains a minor contributor to overall digital emissions. Coren emphasizes that everyday activities such as commuting and dietary choices have a far greater environmental impact, and he shares practical tips for using AI responsibly and efficiently.
Why Chatbots Use Energy
- AI queries consume electricity because GPUs in distant data centers process and return every request.
- Early estimates overstated per-query costs; more recent figures put the energy at roughly 0.24–0.3 watt-hours per query.
Measured Per‑Query Energy
- Independent analyses estimate that a typical AI query uses about 0.3 watt-hours, similar to running an LED bulb for two minutes.
- Google reported a median text response energy of roughly 0.24 watt-hours for Gemini.
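The LED comparison above is easy to verify with back-of-the-envelope arithmetic. A minimal sketch, assuming a typical household LED bulb draws about 9 W (the wattage is an assumption; the episode only says "an LED bulb"):

```python
# Sanity-check the "LED bulb for two minutes" comparison.
# Assumption: a typical household LED bulb draws about 9 W.
led_watts = 9
minutes = 2

# Energy (Wh) = power (W) x time (h)
energy_wh = led_watts * (minutes / 60)
print(f"{energy_wh} Wh")  # matches the ~0.3 Wh per-query estimate
```

At Google's reported ~0.24 Wh median for Gemini, the equivalent LED runtime would be a bit under two minutes.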
Model Size Dictates Energy Use
- Models vary widely: huge models use enormous compute while smaller models run on far less power.
- Small language models can perform specific tasks efficiently on phones, reducing reliance on massive models.
