Sam Altman Wants $7 Trillion

Astral Codex Ten Podcast

Analyzing the Computing Power, Energy Requirements, and Training Data for Future AI Models

This chapter explores the computing power, energy requirements, and training data needed to train future iterations of OpenAI's GPT models. It covers the rapid growth of computing capacity, estimates the number of machines and the amount of energy a training run would require, and examines the difficulty of obtaining enough training data, along with the potential use of synthetic data.
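As a rough illustration of the kind of estimate the chapter describes, the sketch below applies the widely used approximation that training compute is about 6 × (parameters) × (tokens) (from Kaplan et al., 2020). All of the specific numbers are illustrative assumptions, not figures taken from the episode.

```python
# Back-of-envelope training-cost estimate using the common
# compute ~ 6 * N * D approximation (Kaplan et al., 2020).
# Every constant below is an assumption for illustration only.

params = 1e13                # assumed parameter count for a future GPT (10T)
tokens = 2e14                # assumed training tokens (200T)
flops = 6 * params * tokens  # approximate total training FLOPs

gpu_flops = 1e15             # assumed sustained FLOP/s per accelerator (~1 PFLOP/s)
gpu_power_w = 700            # assumed power draw per accelerator, in watts
train_days = 90              # assumed length of the training run

gpu_seconds = flops / gpu_flops
num_gpus = gpu_seconds / (train_days * 86_400)
energy_kwh = num_gpus * gpu_power_w * train_days * 24 / 1000

print(f"Total compute: {flops:.1e} FLOPs")
print(f"Accelerators:  {num_gpus:,.0f} running for {train_days} days")
print(f"Energy:        {energy_kwh:,.0f} kWh")
```

Under these assumed numbers, the run would need on the order of 1.5 million accelerators for 90 days and a few terawatt-hours of electricity, which gives a sense of why the discussion turns to chip supply and energy infrastructure.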

