
#78 – Danny Hernandez on forecasting and the drivers of AI progress

80,000 Hours Podcast

Exploring AI Training and Scalability Insights

This chapter examines two pivotal papers on AI training. The first introduces the gradient noise scale, a measure that predicts how far a machine learning task can be usefully parallelized by increasing the batch size. The second explores scaling laws for neural language models, showing that performance improves predictably as model size, data, and compute grow, and establishing a framework for assessing AI development over time.
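As a rough sketch of the two quantities discussed (notation follows the papers rather than the episode itself, so treat it as an approximation): the gradient noise scale from "An Empirical Model of Large-Batch Training" is, in its simple form,

B_{\mathrm{noise}} \approx \frac{\operatorname{tr}(\Sigma)}{\lVert G \rVert^{2}}

where G is the true gradient and \Sigma is the per-example gradient covariance; batch sizes well below B_{\mathrm{noise}} parallelize almost linearly, while batches well above it give diminishing returns. The model-size scaling law from "Scaling Laws for Neural Language Models" takes the approximate form

L(N) \approx \left(\frac{N_c}{N}\right)^{\alpha_N}, \quad \alpha_N \approx 0.076

meaning test loss falls as a smooth power law in the parameter count N, given sufficient data and compute.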

