Machine Learning Street Talk (MLST)

New top score on ARC-AGI-2-pub (29.4%) - Jeremy Berman

Sep 27, 2025
In this discussion, Jeremy Berman, a research scientist at Reflection AI who recently set the top score on the ARC-AGI-2 public leaderboard, shares his insights on advancing AI reasoning. He advocates for AI systems that can synthesize new knowledge rather than merely memorize data. Berman explores the limitations of current neural networks, emphasizing the potential of evolutionary program synthesis and natural-language approaches. He discusses concepts like knowledge trees and the evolution of AI models capable of true reasoning.
Episode notes
INSIGHT

Reasoning As The Core Meta-Skill

  • Reasoning is the meta-skill that lets a system create new skills from first principles.
  • Jeremy argues that teaching models to reason yields broad improvements in general intelligence.
ANECDOTE

From Startup CTO To AGI Researcher

  • Jeremy left his CTO role and joined research after reading Jeff Hawkins and exploring language models.
  • He then got recruited to an AGI lab and later joined Reflection AI to work on reasoning and post-training.
INSIGHT

Iterative Evolution Beats One-Shot Generation

  • Evolving many candidate solutions and iteratively revising the top ones yields more reliable program synthesis than one-shot generation.
  • Jeremy found that revision loops dramatically improve the model's initial outputs on ARC tasks (see the sketch after this list).
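At a high level, the approach described above is a generate-score-revise cycle. Below is a minimal, hypothetical Python sketch of such a loop; `propose_program` and `revise_program` stand in for LLM calls, and none of the names, parameters, or scoring details come from Berman's actual solution.

```python
from typing import List, Tuple

Grid = List[List[int]]
Task = List[Tuple[Grid, Grid]]  # (input, expected output) training pairs


def score(program_src: str, task: Task) -> float:
    """Fraction of training pairs the candidate program solves exactly."""
    env: dict = {}
    try:
        exec(program_src, env)  # candidate must define transform(grid)
        transform = env["transform"]
        hits = sum(transform(inp) == out for inp, out in task)
        return hits / len(task)
    except Exception:
        return 0.0  # broken programs get zero fitness


def propose_program(task: Task) -> str:
    """Placeholder for an LLM call that drafts a candidate program."""
    return "def transform(grid):\n    return grid"  # identity baseline


def revise_program(program_src: str, task: Task) -> str:
    """Placeholder for an LLM call that revises a candidate given its failures."""
    return program_src  # no-op in this sketch


def evolve(task: Task, population: int = 20, generations: int = 5, keep_top: int = 5) -> str:
    """Generate many candidates, keep the best, and iteratively revise them."""
    candidates = [propose_program(task) for _ in range(population)]
    for _ in range(generations):
        ranked = sorted(candidates, key=lambda p: score(p, task), reverse=True)
        survivors = ranked[:keep_top]
        if score(survivors[0], task) == 1.0:  # perfect on training pairs
            break
        # Revision loop: ask the model to fix the most promising candidates.
        candidates = survivors + [revise_program(p, task) for p in survivors]
    return max(candidates, key=lambda p: score(p, task))
```

The key design choice this illustrates is that selection pressure comes from scoring candidates on the task's training pairs, while the revision step feeds the surviving candidates back to the model instead of starting from scratch each time.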