

Environmental Impact of Large-Scale NLP Model Training with Emma Strubell - TWIML Talk #286
Jul 29, 2019
In this discussion, Emma Strubell, a visiting scientist at Facebook AI Research and incoming professor at Carnegie Mellon, dives into the environmental costs of training NLP models. She shares findings from her influential paper quantifying the carbon emissions of deep learning, and asks whether steady accuracy improvements justify the computation they consume. Emma also discusses how businesses are responding to environmental concerns and the importance of developing efficient, sustainable NLP practices. Her insights bridge cutting-edge research and real-world applications, offering a vision for greener AI.
AI Snips
Environmental Cost of NAS
- Training a single machine translation model with neural architecture search had a massive carbon footprint.
- The estimated emissions were roughly five times the lifetime footprint of an average car, including its manufacturing and fuel.
Report Compute Metrics
- Report metrics such as gigaFLOPs to convergence so models can be compared on computational cost as well as accuracy (a sketch follows below).
- This lets readers weigh accuracy gains against the resources consumed to achieve them.
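
A minimal sketch of what such instrumentation could look like, assuming an NVIDIA GPU (power metered via nvidia-smi) and an analytic per-step FLOP estimate; train_step and the example numbers are placeholders, not the paper's methodology:

```python
# Sketch: meter energy and FLOPs until a model converges.
import subprocess
import time

def gpu_power_watts() -> float:
    """Instantaneous power draw (watts), summed over visible GPUs."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.draw",
         "--format=csv,noheader,nounits"])
    return sum(float(v) for v in out.decode().split())

def train_to_convergence(train_step, flops_per_step, target_loss):
    """Run train_step() until loss <= target_loss; return (gigaFLOPs, kWh)."""
    total_flops, joules, last = 0.0, 0.0, time.time()
    while True:
        loss = train_step()                          # one optimizer step
        now = time.time()
        joules += gpu_power_watts() * (now - last)   # energy = power * dt
        last = now
        total_flops += flops_per_step
        if loss <= target_loss:
            return total_flops / 1e9, joules / 3.6e6

# Hypothetical usage: a step costing ~3.2 TFLOPs, stopping at loss 0.15.
# gflops, kwh = train_to_convergence(my_train_step, 3.2e12, 0.15)
```

Reporting gigaFLOPs alongside kWh separates the algorithmic cost, which transfers across papers, from the efficiency of the particular hardware the experiment happened to run on.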
Report Hyperparameter Sensitivity
- Report sensitivity to hyperparameters so readers can gauge how much energy went into model tuning (see the sketch after this list).
- Tuning is often wasteful and contributes significantly to overall energy consumption.
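
A minimal sketch of that accounting, reusing the metering idea above; run_trial, the search space, and the 20-trial budget are illustrative assumptions, not a prescribed protocol:

```python
# Sketch: report energy for the whole hyperparameter search, not just
# the winning run, plus a crude sensitivity measure (accuracy spread).
import random

def tuning_report(run_trial, n_trials=20, seed=0):
    """run_trial(lr) -> (accuracy, kwh) trains one configuration."""
    rng = random.Random(seed)
    trials = []
    for _ in range(n_trials):
        lr = 10 ** rng.uniform(-5, -2)    # sample a learning rate
        trials.append((lr, *run_trial(lr)))

    accs = [a for _, a, _ in trials]
    best = max(trials, key=lambda t: t[1])
    print(f"best lr={best[0]:.1e}  accuracy={best[1]:.3f}")
    print(f"sensitivity (accuracy spread): {max(accs) - min(accs):.3f}")
    print(f"energy, best run only: {best[2]:.1f} kWh")
    print(f"energy, entire search: {sum(k for *_, k in trials):.1f} kWh")
```

If the accuracy spread is small, most of the search energy bought little; publishing that spread lets others skip the expensive tuning.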