
#107 – Chris Olah on what the hell is going on inside neural networks

80,000 Hours Podcast

CHAPTER

Understanding Scaling Laws in Neural Networks

This chapter explores scaling laws in neural networks: the empirical observation that increasing model size, training compute, and data tends to improve performance in a predictable, power-law fashion. It covers the nuances of reading these trends off log-log graphs, where power laws appear as straight lines, and the interdependence of model size, training duration, and data volume. The discussion also draws out the implications for forecasting AI progress and for keeping ever-larger models aligned with human values.
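The log-log intuition can be made concrete with a small sketch (not from the episode). The snippet below plots a power-law loss curve of the form L(N) = (N_c / N)^α, where N is the parameter count; the constants are illustrative, loosely following published scaling-law fits, and the straight line on log-log axes is the signature pattern the chapter describes.

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative power-law scaling law: loss(N) = (N_c / N) ** alpha,
# where N is the model's parameter count. The constants below are
# assumptions for illustration only, not values from the episode.
N_c = 8.8e13   # hypothetical critical-scale constant
alpha = 0.076  # hypothetical scaling exponent

N = np.logspace(6, 11, 200)    # model sizes from 1M to 100B parameters
loss = (N_c / N) ** alpha      # predicted loss at each model size

plt.loglog(N, loss)            # a power law is a straight line on log-log axes
plt.xlabel("Model parameters (N)")
plt.ylabel("Loss")
plt.title("A power law appears linear on a log-log plot")
plt.show()
```

Because both axes are logarithmic, the slope of that line is the exponent α, which is why small changes in the fitted exponent matter so much when extrapolating to larger models.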
