
Grokking, Generalization Collapse, and the Dynamics of Training Deep Neural Networks with Charles Martin - #734
The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)
00:00
Navigating Neural Networks: Grokking and Generalization
This chapter traces the evolution of spiking neural networks and their connection to modern AI, focusing on the phenomena of grokking and generalization collapse. It examines the difficulties of training deep neural networks, particularly the balance between memorization and generalization, along with the practical challenges of deploying AI in real-world settings. The chapter also highlights the importance of proper data access and compliance in enabling effective data-driven solutions within organizations.