
Want to Understand Neural Networks? Think Elastic Origami! - Prof. Randall Balestriero

Machine Learning Street Talk (MLST)

Intro (00:00)

This chapter examines the training dynamics of neural networks, focusing on the 'grokking' phenomenon, where test performance continues to improve long after training accuracy has plateaued. It highlights the role of effective input space partitioning and introduces the 'elastic origami' picture to illustrate how neural networks fold the input space into complex decision boundaries that drive model performance.
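The input-space-partitioning idea can be made concrete: a ReLU layer splits its input space into regions where the network is exactly linear, one region per distinct on/off activation pattern. The following is a minimal illustrative sketch (not from the episode): it samples a 2D grid through a single randomly initialized ReLU layer and counts the distinct activation patterns encountered, i.e. the linear regions the grid touches. All names and sizes here are arbitrary choices for the example.

```python
import numpy as np

# A single ReLU layer with 8 units acting on 2D inputs.
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 2))
b = rng.normal(size=8)

def activation_pattern(x):
    # Which units are "on" (pre-activation > 0) determines the linear region.
    return tuple((W @ x + b > 0).astype(int))

# Sample a grid over [-2, 2]^2 and collect the distinct patterns seen.
xs = np.linspace(-2.0, 2.0, 200)
patterns = {activation_pattern(np.array([x, y])) for x in xs for y in xs}
num_regions = len(patterns)
print(num_regions)
```

For 8 hyperplanes in 2D the count can never exceed 1 + 8 + 28 = 37 regions; training moves these folds so that the partition concentrates around the data, which is the geometric intuition behind the 'elastic origami' picture.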
