The Twenty Minute VC (20VC): Venture Capital | Startup Funding | The Pitch

20VC: Cohere's Chief Scientist on Why Scaling Laws Will Continue | Whether You Can Buy Success in AI with Talent Acquisitions | The Future of Synthetic Data & What It Means for Models | Why AI Coding is Akin to Image Generation in 2015 with Joelle Pineau

Nov 3, 2025
Joelle Pineau, Chief Scientist at Cohere and former VP of AI Research at Meta, dives deep into AI's future. She discusses the ongoing relevance of reinforcement learning and the efficiency challenges it faces. Joelle examines the rising costs of specialized data and stresses the importance of balancing compute and talent for innovation. She likens today's AI coding landscape to early image generation, anticipating major advancements. With insights on integrating AI in the enterprise and practical approaches to responsible development, Joelle offers a pragmatic yet optimistic view of AI's trajectory.
AI Snips

RL Is Fundamental But Inefficient

  • Reinforcement learning is conceptually fundamental but currently highly inefficient for many real-world tasks.
  • Joelle Pineau stresses we need advances in learning efficiency and better simulators to make RL practical.

Prioritize Algorithmic Breakthroughs

  • Expect linear gains from more compute and data but nonlinear jumps from algorithmic breakthroughs.
  • Invest in algorithmic research because new ideas often produce step-function improvements.

Scaling Laws Remain Robust

  • Scaling laws have delivered robust, predictable improvements as compute, data, and model size increase (a rough sketch of the standard power-law form is below).
  • Joelle Pineau would not bet against scaling laws continuing to contribute to progress.
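
As a rough illustration of the "predictable" part: scaling-law papers typically fit test loss to a power law in training compute. The form and symbols below are the commonly cited ones from that literature, not figures from the episode, and any constants would be illustrative assumptions.

  \[ L(C) = L_{\infty} + a\,C^{-\alpha} \]

Here \(L(C)\) is test loss at compute budget \(C\), \(L_{\infty}\) is the irreducible loss, and \(a, \alpha > 0\) are fitted constants. Multiplying compute by 10 shrinks the reducible term \(a\,C^{-\alpha}\) by a fixed factor of \(10^{-\alpha}\): a smooth, predictable gain, in contrast to the step-function jumps attributed above to algorithmic breakthroughs.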