#77 - Vitaliy Chiley (Cerebras)

Machine Learning Street Talk (MLST)

Exploring Neural Training Paradigms

This chapter examines sparse and dense training in machine learning through the RigL algorithm, addressing the challenges of navigating high-dimensional parameter spaces. It discusses pruning and regrowing weights based on their magnitudes and gradients, and the implications of inductive priors for neural network efficiency. The conversation also critiques traditional mathematical methods, advocating for innovative architectures and the potential for neural networks to exceed human computational limitations.
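To make the drop-and-grow idea concrete, here is a minimal NumPy sketch of a RigL-style update step (an illustration of the general technique, not Cerebras's implementation; the function name and `drop_frac` parameter are assumptions). It drops the lowest-magnitude active weights and regrows the same number of inactive connections where the gradient magnitude is largest, so overall sparsity stays constant:

```python
import numpy as np

def rigl_update(weights, grads, mask, drop_frac=0.3):
    """One RigL-style drop-and-grow step (illustrative sketch).

    weights, grads: dense arrays of the same shape.
    mask: boolean array marking which weights are currently active.
    Returns a new mask with the same number of active weights.
    """
    active = np.flatnonzero(mask)
    inactive = np.flatnonzero(~mask)
    k = int(drop_frac * active.size)
    if k == 0 or inactive.size == 0:
        return mask
    # Drop: the k active weights with the smallest magnitude.
    drop_idx = active[np.argsort(np.abs(weights.flat[active]))[:k]]
    # Grow: the k inactive positions with the largest gradient magnitude.
    grow_idx = inactive[np.argsort(-np.abs(grads.flat[inactive]))[:k]]
    new_mask = mask.copy()
    new_mask.flat[drop_idx] = False
    new_mask.flat[grow_idx] = True
    # Newly grown connections start at zero, as in the RigL paper.
    weights.flat[grow_idx] = 0.0
    return new_mask
```

The key design point the summary alludes to: gradients of *inactive* weights tell you where a connection would reduce the loss fastest if it existed, which is why RigL uses them as the growth criterion while using weight magnitude as the drop criterion.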
