
Dwarkesh Podcast: Adam Marblestone – AI is missing something fundamental about the brain
Dec 30, 2025

Adam Marblestone, CEO of Convergent Research and former Google DeepMind scientist, explores the intersection of neuroscience and AI. He explains how the brain learns efficiently from minimal data and argues that the secret lies in complex reward functions shaped by evolution. Marblestone contrasts the brain's predictive cortex with AI's next-token prediction, and discusses how the brain's cellular diversity maps onto specific behaviors. The conversation also covers what this understanding implies for improving AI systems, and how insights from AI can in turn help unlock neural mechanisms.
Evolution Encodes Learning Curricula
- Evolution likely encodes complex, staged loss functions that bootstrap learning, rather than detailed weights or massive datasets.
- These built-in cost functions can act like curricula, making learning highly sample-efficient (see the sketch below).
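A minimal sketch of the staged-cost-function idea, assuming a toy linear learner and an arbitrary two-stage schedule (the stage losses and the 50-epoch switch are illustrative inventions, not from the episode); it shows the structure of a built-in curriculum, not a demonstration of its sample-efficiency:

```python
# A toy staged curriculum: hypothetical stand-in for evolution's built-in
# cost functions, not anything described in the episode.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))   # toy "sensory" inputs
w_true = rng.normal(size=8)
y = X @ w_true                  # toy task targets

W = np.zeros(8)                 # the learner's parameters

def shaping_grad(W):
    # Stage 1: an innate, task-agnostic objective (here: reconstruct one
    # sensory channel; trivially solvable, it just stands in for a
    # genetically specified bootstrap loss).
    residual = X @ W - X[:, 0]
    return residual @ X / len(X)

def task_grad(W):
    # Stage 2: the actual task objective.
    residual = X @ W - y
    return residual @ X / len(X)

for epoch in range(200):
    # Evolution's "schedule": which built-in cost function is active when.
    grad = shaping_grad(W) if epoch < 50 else task_grad(W)
    W -= 0.1 * grad

print("final task error:", np.mean((X @ W - y) ** 2))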
Cortex As Omnidirectional Predictor
- The cortex might perform omnidirectional prediction, inferring any subset of variables from any other subset, rather than only predicting the next token.
- This supports flexible conditional sampling and could explain richer generalization than current LLMs exhibit (see the sketch below).
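One way to make the contrast concrete is mask-any-subset training, sketched below as an analogy (both sampling functions are hypothetical illustrations, not a model of cortex): a next-token objective always conditions on a prefix, while an omnidirectional objective conditions on an arbitrary observed subset and predicts everything else:

```python
# Contrast between a next-token objective and an "any subset from any other
# subset" objective; both samplers are illustrative, not a cortical model.
import numpy as np

rng = np.random.default_rng(0)

def next_token_example(seq):
    # LLM-style: the context is always a prefix, the target the next item.
    t = int(rng.integers(1, len(seq)))
    return seq[:t], seq[t]

def omnidirectional_example(seq):
    # Any-subset: condition on a random observed subset, predict the rest,
    # which is what supports arbitrary conditional sampling.
    hidden = rng.random(len(seq)) < 0.5
    observed = {int(i): seq[i] for i in np.flatnonzero(~hidden)}
    targets = {int(i): seq[i] for i in np.flatnonzero(hidden)}
    return observed, targets

seq = np.arange(10)
print(next_token_example(seq))
print(omnidirectional_example(seq))
```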
Learning Predicts Steering To Wire Rewards
- Steve Byrnes's steering-vs-learning subsystem idea explains how innate reward circuitry can be linked to arbitrarily learned concepts.
- The learning subsystem trains predictors of the steering signals, so that learned features become wired into innate rewards (sketched below).
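A minimal sketch of that wiring under strong simplifying assumptions (linear features, a least-squares predictor; the names are mine, not Byrnes's): an innate steering signal is defined only over a few hard-coded features, and the learning subsystem regresses that signal onto richer learned features, so novel learned concepts inherit a predicted reward:

```python
# Sketch of a learning subsystem regressing an innate steering signal onto
# learned features (hypothetical names and linear simplifications).
import numpy as np

rng = np.random.default_rng(0)

def steering_signal(innate_features):
    # Innate circuitry: reward hard-coded over a couple of genetically
    # specified features (say, sweetness and bitterness).
    return innate_features @ np.array([2.0, -1.0])

# Rich learned features (cortex-like) that co-occur with the innate ones.
learned = rng.normal(size=(500, 16))
innate = learned[:, :2] + 0.1 * rng.normal(size=(500, 2))  # correlated view
r = steering_signal(innate)

# Learning subsystem: train a predictor of the steering signal from the
# learned features (least squares as the simplest possible learner).
w, *_ = np.linalg.lstsq(learned, r, rcond=None)

# Any state described in learned features now carries a predicted reward,
# even if the innate circuitry never evaluated it directly.
novel_state = rng.normal(size=16)
print("predicted steering value:", novel_state @ w)
```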