Want to Understand Neural Networks? Think Elastic Origami! - Prof. Randall Balestriero

Machine Learning Street Talk (MLST)

Exploring the Complexity of Neural Network Representations

This chapter explores the complexities of neural network training, particularly the interplay between context length and the intrinsic dimensionality of learned representations. The discussion highlights the limitations of Reinforcement Learning from Human Feedback (RLHF) and the challenges that high-dimensional representation spaces pose for model controllability and fine-tuning.
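To make the notion of intrinsic dimensionality concrete, here is a minimal sketch of one common way to estimate it: counting how many principal components are needed to explain most of the variance in a representation matrix. This is an illustration of the concept, not the specific method discussed in the episode; the function name and the 95% threshold are assumptions for the example.

```python
import numpy as np

def intrinsic_dim_pca(X, var_threshold=0.95):
    """Number of principal components needed to explain `var_threshold`
    of the variance in X (rows = samples, columns = embedding features).
    A simple PCA-based proxy for intrinsic dimensionality."""
    Xc = X - X.mean(axis=0)
    # Singular values of the centered data give the PCA spectrum.
    s = np.linalg.svd(Xc, compute_uv=False)
    explained = s**2 / np.sum(s**2)
    cumulative = np.cumsum(explained)
    return int(np.searchsorted(cumulative, var_threshold) + 1)

# Toy data: a 3-dimensional latent structure embedded in 50 ambient
# dimensions, plus a little noise.
rng = np.random.default_rng(0)
latent = rng.normal(size=(1000, 3))
proj = rng.normal(size=(3, 50))
X = latent @ proj + 0.01 * rng.normal(size=(1000, 50))
print(intrinsic_dim_pca(X))  # small relative to the 50 ambient dims
```

The point the example makes is the one from the discussion: representations can live in a very high-dimensional ambient space while their effective (intrinsic) dimensionality is far lower, which matters for how controllable and fine-tunable a model is.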

