
#490 – State of AI in 2026: LLMs, Coding, Scaling Laws, China, Agents, GPUs, AGI

Lex Fridman Podcast


Continual Learning, Memory, and Long Context

They contrast in-context approaches with continual weight updates and LoRA adapters, and discuss context-length limits and compaction strategies.
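One of the techniques named in this chapter, LoRA, can be sketched in a few lines: rather than updating a full weight matrix, it learns a low-rank additive update. The shapes, names, and hyperparameters below are illustrative only, not taken from the episode.

```python
# Minimal LoRA sketch: keep the pretrained weight W frozen and learn a
# low-rank update A @ B, so the effective weight is W + (alpha / r) * A @ B.
# All values here are illustrative assumptions, not from the episode.
import numpy as np

rng = np.random.default_rng(0)
d, r = 8, 2                               # hidden size, low rank (r << d)

W = rng.standard_normal((d, d))           # frozen pretrained weight
A = rng.standard_normal((d, r)) * 0.01    # trainable, small random init
B = np.zeros((r, d))                      # trainable, zero init: adapter starts as a no-op
alpha = 2.0                               # scaling hyperparameter

def forward(x):
    # Frozen base path plus the scaled low-rank adapter path.
    return x @ W + (alpha / r) * (x @ A @ B)

x = rng.standard_normal((1, d))
# Because B starts at zero, the adapter initially leaves the model unchanged:
assert np.allclose(forward(x), x @ W)
```

Only `A` and `B` (2·d·r parameters) would be trained, instead of the d² parameters of `W`, which is what makes adapter-style continual updates cheap relative to full fine-tuning.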

Chapter begins at 02:57:07.
