Want to Understand Neural Networks? Think Elastic Origami! - Prof. Randall Balestriero

Machine Learning Street Talk (MLST)

CHAPTER

Intro

This chapter examines the training dynamics of neural networks, focusing on the 'grokking' phenomenon, in which test metrics continue to improve long after training accuracy has plateaued. It highlights the importance of effective input-space partitioning and introduces the concept of 'elastorigami' to illustrate how neural networks fold the input space into complex decision boundaries, improving model performance.
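The partitioning idea above can be made concrete with a minimal sketch (the architecture and numbers here are illustrative assumptions, not taken from the episode): a small ReLU network cuts its input space into linear regions, and each input's "activation pattern" — which ReLUs are on or off — identifies the region it falls in. Counting distinct patterns over a grid gives a rough count of the regions, the "folds" in the elastic-origami picture.

```python
import numpy as np

# Hypothetical tiny 2-layer ReLU network on a 2-D input (random weights).
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 2)), rng.normal(size=8)  # first hidden layer: 8 ReLU units
W2, b2 = rng.normal(size=(8, 8)), rng.normal(size=8)  # second hidden layer: 8 ReLU units

def activation_pattern(x):
    """Return the on/off state of every ReLU for input x.

    Inputs sharing a pattern lie in the same linear region of the
    piecewise-affine function the network computes.
    """
    h1 = W1 @ x + b1
    h2 = W2 @ np.maximum(h1, 0) + b2
    return tuple((h1 > 0).astype(int)) + tuple((h2 > 0).astype(int))

# Sample a grid over [-2, 2]^2 and count the distinct regions it touches.
grid = np.linspace(-2, 2, 200)
patterns = {activation_pattern(np.array([x, y])) for x in grid for y in grid}
print(f"distinct linear regions touched: {len(patterns)}")
```

Tracking how this region count and the regions' placement around the data evolve during training is one way to study dynamics like grokking beyond the raw loss curves.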
