The Mathematics of Training LLMs — with Quentin Anthony of Eleuther AI

Latent Space: The AI Engineer Podcast

Exploring the Computational Dynamics in Deep Learning

This chapter explores the mathematical foundations behind the compute requirements for training large language models. It emphasizes the computational cost of the backward pass in deep learning and notes how deceptively simple equations can hide lengthy underlying derivations.
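For context, a minimal sketch of the kind of estimate this chapter discusses, following the widely cited approximation from EleutherAI's Transformer Math 101 write-up; the symbols P (parameter count) and D (training tokens) are assumptions for illustration, not quoted from the episode:

% Rough training-compute estimate for a transformer with P parameters
% trained on D tokens; assumes the backward pass costs roughly twice
% the forward pass, which is where the familiar factor of 6 comes from.
\[
  C \;\approx\; \underbrace{2PD}_{\text{forward pass}} \;+\; \underbrace{4PD}_{\text{backward pass}} \;=\; 6PD \quad \text{FLOPs}
\]

The one-line result looks simple, but accounting for exactly which matrix multiplications contribute to the forward and backward passes is what makes the full derivation involved.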
