The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)

Environmental Impact of Large-Scale NLP Model Training with Emma Strubell - TWIML Talk #286

Jul 29, 2019
In this discussion, Emma Strubell, a visiting scientist at Facebook AI Research and incoming professor at Carnegie Mellon University, dives into the environmental costs of training NLP models. She shares findings from her influential paper quantifying the carbon emissions of deep learning, showing that recent accuracy gains have come at a steep computational and environmental price. Emma also discusses how businesses are responding to environmental concerns and the importance of developing efficient, sustainable NLP practices. Her insights bridge cutting-edge research with real-world applications, offering a vision for greener AI.
AI Snips
ANECDOTE

Environmental Cost of NAS

  • Training a machine translation model with neural architecture search had a massive carbon footprint.
  • The emissions were roughly five times the lifetime footprint of an average car, including manufacturing and fuel; see the back-of-the-envelope sketch below.
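To make that comparison concrete, here is a minimal back-of-the-envelope sketch of how such an estimate can be computed from GPU count and runtime. The power-usage-effectiveness (PUE) value, per-GPU draw, grid carbon intensity, and run size below are illustrative assumptions, not figures from the episode.

```python
# Back-of-the-envelope CO2 estimate for a GPU training run.
# All constants are illustrative assumptions, not figures from
# the episode or the underlying paper.

PUE = 1.58                # assumed data-center power usage effectiveness
GPU_WATTS = 250           # assumed average power draw per GPU (W)
LBS_CO2_PER_KWH = 0.954   # assumed grid-average emissions (lbs CO2e per kWh)

def training_co2_lbs(num_gpus: int, hours: float) -> float:
    """Estimate pounds of CO2e emitted by a training run."""
    kwh = PUE * num_gpus * GPU_WATTS * hours / 1000.0
    return kwh * LBS_CO2_PER_KWH

# Hypothetical architecture search: 500 GPUs running for ~4 months.
print(f"{training_co2_lbs(num_gpus=500, hours=4 * 30 * 24):,.0f} lbs CO2e")
```

The point is the structure of the estimate, not the exact constants: energy scales linearly with GPUs and hours, so long searches over many accelerators dominate the footprint.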
ADVICE

Report Compute Metrics

  • Report computational cost metrics, such as total gigaflops to convergence, to enable apples-to-apples comparisons across models; see the sketch below.
  • This lets readers weigh accuracy gains against the resources consumed to achieve them.
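A minimal sketch of what such a report might look like, assuming a per-step FLOP count obtained from a profiler or an analytic count; `FLOPS_PER_STEP` and the run statistics passed to `report_run` are hypothetical placeholders.

```python
# Sketch of reporting "gigaflops to convergence" next to accuracy.
# The per-step FLOP count and the run statistics are hypothetical
# placeholders; a real count would come from a profiler.

FLOPS_PER_STEP = 3.5e12  # assumed FLOPs per forward+backward pass

def report_run(steps_to_convergence: int, final_accuracy: float) -> None:
    total_flops = FLOPS_PER_STEP * steps_to_convergence
    # Surfacing cost alongside the headline metric lets readers judge
    # whether an accuracy gain was worth the extra computation.
    print(f"accuracy:                 {final_accuracy:.3f}")
    print(f"gigaflops to convergence: {total_flops / 1e9:,.0f}")

report_run(steps_to_convergence=300_000, final_accuracy=0.912)
```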
ADVICE

Report Hyperparameter Sensitivity

  • Report how sensitive a model's performance is to its hyperparameters, so readers can gauge the energy spent on tuning and reproduce results with fewer trials; a sketch follows below.
  • Tuning can require many full training runs, so this often-unreported process contributes significantly to overall energy consumption.
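A minimal sketch of accounting for the energy of a whole hyperparameter search rather than just the final run; `estimate_run_kwh`, the grid values, and the per-trial runtime are illustrative assumptions, not measurements.

```python
import itertools

# Sketch of accounting for the energy of a whole hyperparameter search,
# not just the final run. `estimate_run_kwh` and all numbers here are
# illustrative assumptions; real figures would come from measured power.

def estimate_run_kwh(hours: float, gpus: int, gpu_watts: float = 250.0) -> float:
    return gpus * gpu_watts * hours / 1000.0  # assumes constant draw

learning_rates = [1e-3, 3e-4, 1e-4]
batch_sizes = [32, 64, 128]

total_kwh = 0.0
for lr, bs in itertools.product(learning_rates, batch_sizes):
    # Each trial is a full training run; assume ~12 GPU-hours apiece.
    total_kwh += estimate_run_kwh(hours=12.0, gpus=1)

single = estimate_run_kwh(hours=12.0, gpus=1)
print(f"single run:          {single:.1f} kWh")
print(f"9-trial grid search: {total_kwh:.1f} kWh ({total_kwh / single:.0f}x)")
```

Even a small grid multiplies the cost of a single run by the number of trials, which is why reporting sensitivity, and thus how much tuning is actually needed, matters.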