

Data Brew Season 2 Episode 4: Hyperparameter and Neural Architecture Search
May 13, 2021
Chapters
Introduction
00:00 • 2min
The Future of Machine Learning
02:08 • 3min
Using Hyperparameter Tuning Approaches to Make Better Predictions
04:46 • 3min
Bayesian Optimization
07:35 • 4min
The Search Space for Hyperparameter Optimization Is Constrained by the Same Bounds
11:06 • 2min
How Does the Hyperband Approach Actually Differ From Bayesian Optimization?
12:46 • 2min
The Key Idea Behind Hyperband Is Early Stopping
14:53 • 4min
How to Improve Random Search for Hyperparameter Tuning?
18:46 • 2min
Neural Architecture Search
21:04 • 3min
Architecture Search
24:16 • 3min
Neural Architecture Search Is a Once-for-All Approach
27:29 • 3min
The Journey, Not the Destination, Is Much More Enjoyable
30:37 • 3min