
Machine Learning Street Talk (MLST)

#031 WE GOT ACCESS TO GPT-3! (With Gary Marcus, Walid Saba and Connor Leahy)

Nov 28, 2020
This conversation features Gary Marcus, a professor of psychology and neural science known for his critiques of deep learning; Walid Saba, an expert in natural language understanding; and Connor Leahy, a proponent of large language models. They dig into GPT-3's strengths and weaknesses, the philosophical implications of AI creativity, and the importance of integrating reasoning with pattern recognition. The dialogue also critiques AI's limitations in understanding language and explores possible paths toward true artificial general intelligence.
02:44:06

Podcast summary created with Snipd AI

Quick takeaways

  • GPT-3 excels in text generation but struggles with structured tasks like math due to its training data limitations.
  • GPT-3 creates an illusion of comprehension without deep understanding: it lacks genuine reasoning abilities and struggles to track context.

Deep dives

Insights on GPT-3 and Language Models

GPT-3 is praised for its scalability and versatility, with performance improving as the models grow larger. It excels at open-ended text generation and poetry but falls short on structured tasks such as arithmetic. Its training data contains little tabular information, which may limit its grasp of these tasks. One curious finding is that it tends to pick up only the first post of forum threads, pointing to limitations in how its training data was ingested.
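
The arithmetic weakness discussed here is easy to probe directly. Below is a minimal sketch (not from the episode) of the kind of experiment the guests describe, assuming the 2020-era `openai` Python client and the `davinci` completions engine; the engine name, prompt format, and the helper `ask_gpt3` are illustrative assumptions, not anything the hosts or guests ran verbatim.

import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]  # assumes an API key is set in the environment

def ask_gpt3(prompt: str) -> str:
    """Send one completion request to the davinci engine and return the text."""
    response = openai.Completion.create(
        engine="davinci",   # largest publicly exposed GPT-3 engine at the time
        prompt=prompt,
        max_tokens=8,
        temperature=0.0,    # greedy decoding, so failures aren't sampling noise
    )
    return response.choices[0].text.strip()

# Small two-digit sums are often answered correctly (plausibly memorized from
# the training data); longer operands expose the lack of a learned algorithm.
for a, b in [(23, 45), (1234, 5678), (123456, 654321)]:
    answer = ask_gpt3(f"Q: What is {a} + {b}?\nA:")
    print(f"{a} + {b} -> model said {answer!r}, correct is {a + b}")

Running a sweep like this over operand lengths is one simple way to see the pattern Marcus points to: accuracy that degrades with digit count suggests pattern matching over training data rather than an internalized procedure for addition.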
