Data Brew Season 2 Episode 4: Hyperparameter and Neural Architecture Search

Data Brew by Databricks

How to Improve Random Search for Hyperparameter Tuning?

Random search for hyperparameter tuning is very competitive, and there are situations in which random search performs almost as well as Bayesian optimization. Early stopping seemed like a really good technique to combine with random search, and it works really well. So I think when we get to the next section, where we talk about neural architecture search, random search will show up again.
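To make the idea concrete, here is a minimal sketch (not from the episode) of random search combined with a successive-halving-style early-stopping rule: train many randomly sampled configurations briefly, keep the best half, and double the training budget each round. The search space, the `validation_loss` stand-in, and the budget sizes are illustrative assumptions, not anything specified by the speakers.

```python
import random

def sample_config():
    # Draw one hyperparameter configuration at random (assumed search space).
    return {
        "lr": 10 ** random.uniform(-4, -1),
        "batch_size": random.choice([32, 64, 128]),
    }

def validation_loss(config, epochs):
    # Stand-in for real training + evaluation; replace with your own model.
    # Synthetic curve: lr closer to 0.01 is better, and loss shrinks with epochs.
    base = abs(config["lr"] - 0.01) * 50 + 128 / config["batch_size"]
    return base / (1 + epochs)

def random_search_with_early_stopping(n_configs=16, max_epochs=32):
    # Successive-halving-style schedule: evaluate everyone on a small budget,
    # early-stop the worst half, double the budget, and repeat.
    survivors = [sample_config() for _ in range(n_configs)]
    epochs = 1
    while len(survivors) > 1 and epochs <= max_epochs:
        scored = sorted(survivors, key=lambda c: validation_loss(c, epochs))
        survivors = scored[: max(1, len(scored) // 2)]
        epochs *= 2
    return survivors[0]

if __name__ == "__main__":
    best = random_search_with_early_stopping()
    print("best config:", best)
```

The point of the schedule is that most of the compute goes to configurations that already look promising, which is why early stopping pairs so naturally with plain random search.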
