The Most Interesting Thing in A.I.

Incremental vs. Exponential - with Nicholas Thompson and Andrew Ng

Jan 8, 2025
In this discussion, Andrew Ng, founder of DeepLearning.AI and a leading figure in AI, takes on the question of whether AI is advancing incrementally or exponentially. He contrasts steady technical progress with the explosive growth in societal interest and enthusiasm for AI. Ng highlights the shift toward more human-like AI behaviors and falling training costs. He also explores decentralized training through federated learning, the rise of open-source AI, and the competitive landscape between the U.S. and China, sketching a future rich with innovation and opportunity.
INSIGHT

AI's Steady Rise

  • Andrew Ng argues that AI's rise isn't sudden; it has been steady exponential growth over roughly 20 years.
  • Increased societal awareness, together with a surge in applications, makes it feel abrupt.
INSIGHT

Scaling Laws and Their Limits

  • While scaling laws (bigger models, more data) drove recent progress, their exponential cost limits their long-term viability.
  • Andrew Ng suggests other factors will also drive AI advancement.
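The cost argument can be made concrete with the empirical power-law form that scaling-law studies report. The specific functional form below follows the widely cited "Chinchilla" fit (Hoffmann et al.) and is an illustration, not something stated in the episode:

```latex
% Loss falls only as a power law in parameters N and training tokens D:
L(N, D) \approx E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}
% while training compute grows with both:
C \approx 6\,N\,D
```

Because loss improves only polynomially in \(N\) and \(D\) while compute grows with their product, each further fixed reduction in loss demands a multiplicative increase in compute, which is the long-term limit Ng points to.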
INSIGHT

Agentic Workflows

  • Agentic workflows, in which an AI iteratively drafts, critiques, and revises its own output much as a human writer does, are improving AI performance.
  • Unlike scaling, this approach is cost-effective and applicable across many fields.
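The draft-critique-revise loop Ng describes can be sketched in a few lines. This is a minimal illustration, not his implementation: `generate` and `critique` are hypothetical stand-ins for LLM calls, replaced here with toy functions so the loop is runnable.

```python
def agentic_refine(task, generate, critique, max_rounds=3):
    """Iteratively draft, critique, and revise until the critic is satisfied.

    generate(task, feedback) and critique(task, output) stand in for
    model calls; the loop itself is the agentic-workflow pattern.
    """
    output = generate(task, feedback=None)
    for _ in range(max_rounds):
        feedback = critique(task, output)
        if feedback is None:  # critic found nothing to fix: stop early
            break
        output = generate(task, feedback=feedback)
    return output


# Toy stand-ins: the task is "map each word to its length". The first
# draft is deliberately incomplete; the critic flags missing words and
# the generator patches exactly those.
def generate(task, feedback=None):
    words = task.split()
    if feedback is None:
        return {w: len(w) for w in words[:1]}  # incomplete first draft
    revised = dict(feedback["current"])
    revised.update({w: len(w) for w in feedback["missing"]})
    return revised


def critique(task, output):
    missing = [w for w in task.split() if w not in output]
    return {"current": output, "missing": missing} if missing else None
```

Calling `agentic_refine("a bb ccc", generate, critique)` converges in one revision round, which illustrates why the pattern is cheap relative to scaling: extra quality comes from a handful of additional inference calls, not a larger model.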