Machine Learning Street Talk (MLST)

NLP is not NLU and GPT-3 - Walid Saba

Nov 4, 2020
Walid Saba, an expert in natural language understanding and co-founder of Ontologic, challenges conventional views on deep learning, arguing that missing ontological knowledge is a critical gap in NLU. The conversation digs into the limitations of models like GPT-3, emphasizing the need for contextual knowledge rather than mere data memorization. Saba critiques existing evaluation methods and advocates a deeper account of language that goes beyond technical applications, highlighting the complex interplay of reasoning, intention, and human cognition.
INSIGHT

Language Models as UFOs

  • Walid Saba believes that language models are like UFOs: often talked about, rarely seen performing as claimed.
  • He questions the effectiveness of current language models and their understanding of language.
INSIGHT

Intension vs. Extension

  • Walid Saba emphasizes the importance of "intension" in logic and language understanding.
  • He argues that two expressions can be equal in value (extension) yet differ in the properties they express (intension), a distinction crucial for language (see the sketch after this list).
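To make the distinction concrete, here is a minimal Python sketch (not from the episode; the function names are illustrative): two functions that denote the same mapping from inputs to outputs are extensionally equal, yet they remain different procedures with different properties, so a context that cares about how a value is expressed cannot freely substitute one for the other.

    def double_by_addition(n: int) -> int:
        """Compute 2*n by adding n to itself."""
        return n + n

    def double_by_shift(n: int) -> int:
        """Compute 2*n with a left bit shift."""
        return n << 1

    # Extensional equality: the two functions agree on every tested input,
    # i.e. they denote the same mapping from integers to integers.
    assert all(double_by_addition(k) == double_by_shift(k) for k in range(-1000, 1000))

    # Intensional difference: they are distinct procedures with distinct
    # properties (different operations, different bytecode), so a context
    # sensitive to the property rather than the value can tell them apart.
    print(double_by_addition.__code__.co_code == double_by_shift.__code__.co_code)  # False

The same contrast shows up in language: "2 + 2" and "4" pick out the same value, yet a sentence about computing "2 + 2" does not mean the same as a sentence about computing "4".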
INSIGHT

The Knowledge Bottleneck

  • The knowledge acquisition bottleneck was a key driver behind the shift to data-driven AI in the 1990s.
  • Walid Saba questions whether the field overreacted to the knowledge bottleneck, given how much today's data scientists still struggle.