#65 Prof. PEDRO DOMINGOS [Unplugged]

Machine Learning Street Talk (MLST)

Jonathan Frankle: The Role of Sparsity in Neural Networks

Numenta emphasizes sparsity in its exploration of neural networks and gradient descent. Architectural tweaks can make networks easier to optimize with gradient descent, but sparsity yields a more robust representation, and shoehorning everything into what gradient descent optimizes well is harmful in the long run. Sparsity is crucial for efficient learning.
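The sparsity Numenta emphasizes is often enforced with a k-winners-take-all activation: only the k largest units in a layer stay active, and the rest are zeroed. A minimal NumPy sketch of that idea follows; the function name and interface are illustrative, not taken from the episode.

```python
import numpy as np

def k_winners(x: np.ndarray, k: int) -> np.ndarray:
    """Keep only the k largest activations; zero out the rest.

    Illustrative sketch of a k-winners-take-all sparse activation;
    real implementations (e.g. Numenta's) add boosting and handle
    batches, which is omitted here.
    """
    if k >= x.size:
        return x.copy()
    out = np.zeros_like(x)
    top = np.argpartition(x, -k)[-k:]  # indices of the k largest values
    out[top] = x[top]
    return out

dense = np.array([0.1, 0.9, -0.3, 0.5, 0.05])
sparse = k_winners(dense, k=2)  # only the two largest activations survive
```

The resulting representation is mostly zeros regardless of the input, which is what makes it robust to noise: small perturbations rarely change which units win.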

