

Eco-aware GNN Recommenders
Aug 30, 2025
Antonio Purificato, a PhD student at Sapienza University of Rome and researcher at Amazon, discusses his work on eco-aware graph neural networks. He delves into the environmental costs of traditional recommendation systems and presents methods to model user-item relationships with sustainability in mind. Topics include the CodeCarbon framework for monitoring energy consumption and the balance between algorithmic performance and ecological responsibility. Antonio highlights the need for eco-friendly practices as AI continues to scale.
AI Snips
Training Dominates Energy Cost
- Training is the main energy cost in ML, often far above inference.
- Larger models require more GPU hours and thus higher CO2-equivalent emissions; a rough back-of-the-envelope estimate is sketched below.
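To make the GPU-hours-to-emissions link concrete, here is a minimal back-of-the-envelope sketch using the common energy-times-grid-intensity estimate. It is not from the episode, and all numbers are assumed placeholders:

```python
# Rough, illustrative estimate of training emissions.
# All constants below are assumed example values, not figures from the episode.

GPU_POWER_KW = 0.3          # average draw of one GPU, in kilowatts (assumed)
NUM_GPUS = 4                # GPUs used for training (assumed)
TRAIN_HOURS = 48            # wall-clock training time (assumed)
PUE = 1.5                   # data-center power usage effectiveness (assumed)
GRID_KG_CO2_PER_KWH = 0.4   # regional grid carbon intensity (assumed)

energy_kwh = GPU_POWER_KW * NUM_GPUS * TRAIN_HOURS * PUE
co2e_kg = energy_kwh * GRID_KG_CO2_PER_KWH

print(f"Energy: {energy_kwh:.1f} kWh, emissions: {co2e_kg:.1f} kg CO2e")
# Energy: 86.4 kWh, emissions: 34.6 kg CO2e
```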
Measure Emissions With CodeCarbon
- Use tools like CodeCarbon to automatically track energy use and CO2-equivalent emissions for experiments.
- Track hardware, runtime, and region-level grid mix to estimate environmental impact; see the sketch after this list.
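As one possible setup, here is a minimal sketch of wrapping a training run with CodeCarbon's EmissionsTracker. The train() function is a hypothetical stand-in for your own GNN training loop, not code from the episode:

```python
# Minimal CodeCarbon sketch (pip install codecarbon).
from codecarbon import EmissionsTracker

def train():
    # Hypothetical placeholder for the actual recommender training loop.
    pass

tracker = EmissionsTracker(project_name="gnn-recommender")  # writes emissions.csv by default
tracker.start()
try:
    train()
finally:
    emissions_kg = tracker.stop()  # estimated kg CO2-equivalent for the run

print(f"Estimated training emissions: {emissions_kg:.4f} kg CO2e")
```

CodeCarbon detects the available hardware and applies a regional carbon-intensity estimate, which is what the "region-level grid mix" point above refers to.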
Embedding Size Impacts Emissions
- Embedding size strongly affects training time and environmental impact.
- Bigger embeddings usually increase parameter count, runtime, and CO2 emissions, though exceptions exist; a sketch of sweeping embedding sizes while tracking emissions follows below.
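A minimal sketch of how one might quantify that trade-off: sweep embedding dimensions and record per-run emissions with CodeCarbon. The build_and_train() function and the dimension grid are hypothetical placeholders, not the episode's or paper's code:

```python
# Illustrative embedding-size sweep with per-run emissions tracking.
from codecarbon import EmissionsTracker

def build_and_train(embedding_dim: int) -> float:
    # Hypothetical placeholder: build a GNN recommender with this embedding
    # size, train it, and return a validation metric.
    return 0.0

results = []
for dim in (16, 32, 64, 128, 256):  # assumed example grid
    tracker = EmissionsTracker(project_name=f"gnn-embed-{dim}")
    tracker.start()
    try:
        metric = build_and_train(dim)
    finally:
        emissions_kg = tracker.stop()
    results.append((dim, metric, emissions_kg))

for dim, metric, emissions_kg in results:
    print(f"dim={dim:4d}  metric={metric:.4f}  emissions={emissions_kg:.5f} kg CO2e")
```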