Economics Matters with Laurence Kotlikoff

Seth Benzell Is Back to Update Us on All Things AI

Nov 14, 2025
Seth Benzell, an Assistant Professor at Chapman University and a Digital Fellow at MIT and Stanford, discusses the complex economics surrounding AI. He delves into the scaling law and how double descent shapes model performance, shedding light on why investors back loss-making companies in the expectation of future gains. Seth also weighs AI's potential for large cost savings and job displacement, while cautioning against overconfidence in imminent breakthroughs. The conversation unpacks AI's implications for labor and the economy, offering keen insights into its future.
INSIGHT

Scaling Law Drives Massive AI Bets

  • The scaling law implies model performance improves predictably as compute, parameters, and training data scale up (see the sketch below).
  • Firms are investing trillions on the bet that bigger models will yield qualitatively superior, economy-changing AI.
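
A minimal sketch of what "predictably" means, assuming a Chinchilla-style parametric form L(N, D) = E + A/N^alpha + B/D^beta. The constants loosely follow the Hoffmann et al. (2022) fit and are illustrative, not numbers from this episode:

```python
# Chinchilla-style scaling law: predicted pretraining loss as a function
# of parameter count N and training tokens D. Constants are assumptions,
# loosely matching the Hoffmann et al. (2022) fit.
E, A, B, ALPHA, BETA = 1.69, 406.4, 410.7, 0.34, 0.28

def predicted_loss(n_params: float, n_tokens: float) -> float:
    """L(N, D) = E + A / N**alpha + B / D**beta."""
    return E + A / n_params**ALPHA + B / n_tokens**BETA

# Loss falls smoothly as both axes scale up -- the regularity that
# underwrites multi-trillion-dollar bets on bigger models.
for n, d in [(1e9, 20e9), (10e9, 200e9), (100e9, 2e12)]:
    print(f"N={n:.0e} params, D={d:.0e} tokens -> loss {predicted_loss(n, d):.3f}")
```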
INSIGHT

Overparameterization And Double Descent

  • Modern LLMs operate in an overparameterized regime, with far more parameters than classical statistics would deem safe for their training data.
  • Double descent lets models improve again past the point of extreme overfitting, enabling surprisingly strong generalization (see the sketch below).
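
A toy illustration of double descent, not from the episode: minimum-norm least squares on random Fourier features. Test error typically peaks near the interpolation threshold, where the number of features roughly equals the number of training points, and then falls again in the overparameterized regime; exact numbers depend on the random seeds.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: noisy samples of a smooth target function.
n_train = 20
x_tr = rng.uniform(0, 1, n_train)
y_tr = np.sin(2 * np.pi * x_tr) + 0.1 * rng.standard_normal(n_train)
x_te = np.linspace(0, 1, 200)
y_te = np.sin(2 * np.pi * x_te)

def features(x: np.ndarray, k: int, seed: int = 1) -> np.ndarray:
    """Random Fourier features with frequencies/phases fixed across calls."""
    r = np.random.default_rng(seed)
    w = r.normal(0.0, 8.0, k)
    b = r.uniform(0.0, 2 * np.pi, k)
    return np.cos(np.outer(x, w) + b)

# Sweep model size through the interpolation threshold (k ~ n_train).
for k in [2, 5, 10, 15, 20, 25, 40, 100, 500]:
    # For k > n_train, lstsq returns the minimum-norm interpolating
    # solution -- the implicit regularization behind the second descent.
    theta, *_ = np.linalg.lstsq(features(x_tr, k), y_tr, rcond=None)
    mse = np.mean((features(x_te, k) @ theta - y_te) ** 2)
    print(f"k={k:4d} features  test MSE={mse:.3f}")
```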
INSIGHT

The Bitter Lesson: Scale Outperforms Custom Work

  • Rich Sutton's 'bitter lesson' says that general methods which scale with compute beat bespoke, task-specific engineering.
  • Because Moore's Law keeps rewarding scalable approaches, general-purpose models often outcompete domain-tuned solutions.