
#107 – Chris Olah on what the hell is going on inside neural networks

80,000 Hours Podcast


Understanding Scaling Laws in Neural Networks

This chapter explores the concept of scaling laws in neural networks, particularly how increased model size and computational resources generally lead to improved performance. It highlights the nuances of interpreting log-log graphs and the interconnected relationships between model size, training duration, and data volume. The discussion also emphasizes the implications for forecasting advancements in AI and ensuring alignment with human values as models grow larger.
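The straight lines seen on those log-log graphs come from power-law fits of the form loss = a·N^(−α) + irreducible. A minimal numerical sketch of why the line is straight is below; the constants `A`, `ALPHA`, and `IRREDUCIBLE` are illustrative assumptions, not figures quoted in the episode or from any particular published fit:

```python
import math

# Hypothetical power-law scaling: loss(N) = A * N**(-ALPHA) + IRREDUCIBLE.
# All three constants are illustrative assumptions for this sketch.
A = 10.0
ALPHA = 0.076       # assumed scaling exponent
IRREDUCIBLE = 1.69  # assumed irreducible-loss floor

def loss(n_params: float) -> float:
    """Predicted loss for a model with n_params parameters."""
    return A * n_params ** -ALPHA + IRREDUCIBLE

def loglog_slope(n1: float, n2: float) -> float:
    """Slope of log(loss - floor) vs log(N) between two model sizes."""
    y1 = math.log(loss(n1) - IRREDUCIBLE)
    y2 = math.log(loss(n2) - IRREDUCIBLE)
    return (y2 - y1) / (math.log(n2) - math.log(n1))

# Because log(loss - floor) = log(A) - ALPHA * log(N), the slope is the
# same constant -ALPHA between any pair of model sizes -- that constancy
# is exactly what the straight line on a log-log plot means.
print(loglog_slope(1e6, 1e8))   # ≈ -0.076
print(loglog_slope(1e8, 1e11))  # same slope
```

Reading such a plot correctly matters for the forecasting point above: extrapolating the straight line assumes the power law keeps holding at sizes nobody has trained yet.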

