

Pretty Art, Ugly Footprint: Building Sustainable AI
May 14, 2025
Anwesha Sen, a tech policy and AI governance researcher at the Takshashila Institution, joins the discussion to highlight the environmental impact of AI technologies. She outlines the steep energy demands of data centers, which could consume about 10% of global electricity by 2030. The conversation explores the potential of Small Modular Reactors (SMRs) as a sustainable energy source for growing AI infrastructure. Sen emphasizes the importance of proactive planning and cross-sector collaboration to ensure a greener future for AI.
Why AI Consumes So Much Power
- AI training and inference both draw continuous, large-scale electricity because of huge model sizes and long training runs.
- GPUs/TPUs and iterative retraining sustain power consumption across thousands of machines.
Data Centers Could Strain Power And Water
- Data centers' electricity share could rise from ~3% today to about 10% by 2030, stressing grids and water resources.
- Cooling needs also create huge local water demand, compounding sustainability concerns.
Reduce Water Use In Data Centers
- Deploy waterless and direct-to-chip cooling to cut fresh water use at data centers.
- Recycle cooling water and adopt liquid cooling to reduce municipal water consumption.