The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)

Neural Architecture Search and Google’s New AutoML Zero with Quoc Le - #366

Apr 16, 2020
Quoc Le, a research scientist at Google known for his pioneering work on AutoML Zero and neural architecture search, joins us to discuss his recent work. He shares insights on using evolutionary methods to optimize machine learning models and the challenges of scaling them, discusses semi-supervised learning techniques that improve data labeling and model performance, and touches on the humorous side of language models, including the unexpected puns and jokes they generate. Tune in for a blend of cutting-edge AI and a few laughs!
AI Snips
INSIGHT

Compute Constraints in NAS

  • Even with Google's compute resources, Quoc Le had to wait for model training to get faster before NAS became practical.
  • He budgeted for 1,000-10,000 model trainings, each taking 30 minutes to a few hours on CIFAR-10 (see the back-of-envelope sketch below).
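To make the constraint concrete, here is a back-of-envelope calculation of the wall-clock time those numbers imply. The worker count is a hypothetical assumption for illustration; only the run counts and per-run times come from the episode.

```python
# Rough sequential-time bounds implied by the numbers above.
# Assumption: each training run fully occupies one worker.
runs_low, runs_high = 1_000, 10_000
hours_low, hours_high = 0.5, 3.0   # 30 minutes to a few hours per run
workers = 100                      # hypothetical number of parallel workers

best_case = runs_low * hours_low / workers
worst_case = runs_high * hours_high / workers
print(f"{best_case:.0f} to {worst_case:.0f} hours of wall-clock time")
# -> "5 to 300 hours": days to weeks even at this scale, which is why
#    NAS had to wait for cheaper, faster training.
```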
INSIGHT

Evolutionary Algorithms for NAS

  • Quoc Le's team initially focused on architectural hyperparameters, not learning rate or weight decay.
  • Evolutionary algorithms proved more flexible than reinforcement learning for NAS and better at keeping the population of candidate models diverse (a minimal sketch follows this list).
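A minimal sketch of how evolutionary NAS works, in the spirit of the regularized-evolution approach from Le's group (tournament selection plus "aging", where the oldest model is always retired). The toy search space, mutation rule, and stand-in fitness function are assumptions for illustration; in the real setting, fitness() is the 30-minutes-to-hours training run.

```python
import random
import collections

# Toy search space: an "architecture" is a list of op choices per layer.
OPS = ["conv3x3", "conv5x5", "maxpool", "identity"]
NUM_LAYERS = 6

def random_arch():
    return [random.choice(OPS) for _ in range(NUM_LAYERS)]

def mutate(arch):
    # Replace one randomly chosen layer's op -- the only mutation used here.
    child = list(arch)
    child[random.randrange(NUM_LAYERS)] = random.choice(OPS)
    return child

def fitness(arch):
    # Stand-in for "train the model, return validation accuracy".
    return sum(op != "identity" for op in arch) + random.random()

def regularized_evolution(cycles=200, population_size=50, sample_size=10):
    population = collections.deque()
    for _ in range(population_size):
        arch = random_arch()
        population.append((arch, fitness(arch)))
    best = max(population, key=lambda p: p[1])
    for _ in range(cycles):
        # Tournament selection: sample a few, take the fittest as parent.
        candidates = random.sample(list(population), sample_size)
        parent = max(candidates, key=lambda p: p[1])
        child = mutate(parent[0])
        child_fit = fitness(child)
        population.append((child, child_fit))
        population.popleft()  # aging: always retire the oldest model
        if child_fit > best[1]:
            best = (child, child_fit)
    return best
```

The aging step is what keeps the population diverse: even a strong architecture is eventually discarded, so the search cannot collapse onto one lineage the way a purely greedy selection can.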
INSIGHT

Beyond BatchNorm and ReLU

  • The BatchNorm-ReLU pair is effective, but BatchNorm's reliance on batch statistics makes it problematic at small batch sizes.
  • Quoc Le's team searched for a single replacement layer and found one that outperforms BatchNorm plus ReLU and works without batch statistics, making it useful well beyond their own search (sketched below).
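One published example of such an evolved layer from Le's group is EvoNorm-S0, which fuses a Swish-like nonlinearity with per-sample group statistics, so it behaves identically at any batch size. Below is a minimal NumPy sketch; the NHWC layout, group count, and parameter shapes are illustrative assumptions, not the exact implementation discussed in the episode.

```python
import numpy as np

def evonorm_s0(x, gamma, beta, v, groups=32, eps=1e-5):
    # x: (N, H, W, C) activations; gamma, beta, v: (C,) learned params.
    # C must be divisible by the group count.
    n, h, w, c = x.shape
    g = min(groups, c)
    # Group standard deviation is computed per sample -> no batch
    # statistics, so batch size 1 and batch size 1024 behave the same.
    xg = x.reshape(n, h, w, g, c // g)
    var = xg.var(axis=(1, 2, 4), keepdims=True)     # (N, 1, 1, g, 1)
    std = np.sqrt(var + eps)
    std = np.broadcast_to(std, xg.shape).reshape(n, h, w, c)
    # Swish-like numerator fused with the normalizer: one layer
    # replaces the usual BatchNorm -> ReLU pair.
    num = x * (1.0 / (1.0 + np.exp(-v * x)))        # x * sigmoid(v * x)
    return num / std * gamma + beta

# Usage: a 2-image batch of 8x8 feature maps with 64 channels.
x = np.random.randn(2, 8, 8, 64).astype(np.float32)
y = evonorm_s0(x, np.ones(64), np.zeros(64), np.ones(64))
```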