
#490 – State of AI in 2026: LLMs, Coding, Scaling Laws, China, Agents, GPUs, AGI
Lex Fridman Podcast
Nathan Lambert and Sebastian Raschka are machine learning researchers, engineers, and educators. Nathan is the post-training lead at the Allen Institute for AI (Ai2) and the author of The RLHF Book. Sebastian Raschka is the author of Build a Large Language Model (From Scratch) and Build a Reasoning Model (From Scratch).
Thank you for listening ❤ Check out our sponsors: https://lexfridman.com/sponsors/ep490-sc
See below for timestamps, transcript, and to give feedback, submit questions, contact Lex, etc.
Transcript:
https://lexfridman.com/ai-sota-2026-transcript
CONTACT LEX:
Feedback – give feedback to Lex: https://lexfridman.com/survey
AMA – submit questions, videos or call-in: https://lexfridman.com/ama
Hiring – join our team: https://lexfridman.com/hiring
Other – other ways to get in touch: https://lexfridman.com/contact
SPONSORS:
To support this podcast, check out our sponsors & get discounts:
Box: Intelligent content management platform.
Go to https://box.com/ai
Quo: Phone system (calls, texts, contacts) for businesses.
Go to https://quo.com/lex
UPLIFT Desk: Standing desks and office ergonomics.
Go to https://upliftdesk.com/lex
Fin: AI agent for customer service.
Go to https://fin.ai/lex
Shopify: Sell stuff online.
Go to https://shopify.com/lex
CodeRabbit: AI-powered code reviews.
Go to https://coderabbit.ai/lex
LMNT: Zero-sugar electrolyte drink mix.
Go to https://drinkLMNT.com/lex
Perplexity: AI-powered answer engine.
Go to https://perplexity.ai/
OUTLINE:
(00:00) – Introduction
(01:39) – Sponsors, Comments, and Reflections
(16:29) – China vs US: Who wins the AI race?
(25:11) – ChatGPT vs Claude vs Gemini vs Grok: Who is winning?
(36:11) – Best AI for coding
(43:02) – Open Source vs Closed Source LLMs
(54:41) – Transformers: Evolution of LLMs since 2019
(1:02:38) – AI Scaling Laws: Are they dead or still holding?
(1:18:45) – How AI is trained: Pre-training, Mid-training, and Post-training
(1:51:51) – Post-training explained: Exciting new research directions in LLMs
(2:12:43) – Advice for beginners on how to get into AI development & research
(2:35:36) – Work culture in AI (72+ hour weeks)
(2:39:22) – Silicon Valley bubble
(2:43:19) – Text diffusion models and other new research directions
(2:49:01) – Tool use
(2:53:17) – Continual learning
(2:58:39) – Long context
(3:04:54) – Robotics
(3:14:04) – Timeline to AGI
(3:21:20) – Will AI replace programmers?
(3:39:51) – Is the dream of AGI dying?
(3:46:40) – How will AI make money?
(3:51:02) – Big acquisitions in 2026
(3:55:34) – Future of OpenAI, Anthropic, Google DeepMind, xAI, Meta
(4:08:08) – Manhattan Project for AI
(4:14:42) – Future of NVIDIA, GPUs, and AI compute clusters
(4:22:48) – Future of human civilization
