Want to Understand Neural Networks? Think Elastic Origami! - Prof. Randall Balestriero

Machine Learning Street Talk (MLST)

CHAPTER

Exploring the Complexity of Neural Network Representations

This chapter explores the complexities of neural network training, particularly the interplay between context length and the intrinsic dimensionality of learned representations. The discussion highlights the limitations of Reinforcement Learning from Human Feedback (RLHF) and the challenges that high-dimensional representation spaces pose for model controllability and fine-tuning.
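
To make the notion of "intrinsic dimensionality of representations" concrete, here is a minimal illustrative sketch (not from the episode): one common way to gauge it is to count how many principal components are needed to explain most of the variance of a set of token representations. The function name `effective_dim` and the synthetic data are assumptions for illustration only.

```python
import numpy as np

def effective_dim(hidden_states: np.ndarray, var_threshold: float = 0.99) -> int:
    """Number of principal components explaining `var_threshold` of the variance.

    hidden_states: (num_tokens, hidden_size) array of representations.
    """
    centered = hidden_states - hidden_states.mean(axis=0, keepdims=True)
    # Singular values of the centered data give per-component standard deviations.
    svals = np.linalg.svd(centered, compute_uv=False)
    var = svals ** 2
    cum = np.cumsum(var) / var.sum()
    # First index where the cumulative explained variance crosses the threshold.
    return int(np.searchsorted(cum, var_threshold) + 1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy example: representations that actually live on a 10-dim subspace
    # embedded in a 256-dim hidden space, plus a little noise.
    low_rank = rng.normal(size=(512, 10)) @ rng.normal(size=(10, 256))
    noisy = low_rank + 0.01 * rng.normal(size=(512, 256))
    print(effective_dim(noisy))  # close to 10, far below the ambient 256
```

Running such an estimator on hidden states collected at different context lengths is one way to probe how the effective dimensionality of a model's representations grows or saturates with context.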
