

Is Linguistics Missing from NLP Research? w/ Emily M. Bender - #376
May 18, 2020
In this engaging discussion, Emily M. Bender, a Professor of Linguistics at the University of Washington, explores the intersection of linguistics and NLP. She challenges the current boundaries of NLP research and emphasizes the importance of linguistic insights in improving language models. The conversation dives into the limitations of models like BERT in grasping true meaning and highlights the ethical implications of language in technology. Bender also advocates for interdisciplinary collaboration to enhance understanding and inclusivity in NLP.
Linguistics Journey
- Emily Bender's background is in linguistics, beginning with her undergraduate studies at UC Berkeley.
- She pursued a PhD at Stanford, focusing on sociolinguistics and syntax.
Computational Syntax
- Bender's syntax work was grounded in computational modeling, using Head-driven Phrase Structure Grammar (HPSG).
- This framework emphasizes detailed, computer-readable representations of grammar, as sketched in the example below.
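To make the "computer-readable" point concrete, here is a minimal sketch (not from the episode, and not a real HPSG grammar): HPSG-style analyses describe words and phrases as nested attribute-value structures, and analysis proceeds by unification, which fails when features clash. All names and values below are illustrative assumptions.

# Minimal sketch of HPSG-style feature structures as nested dicts.
# Unification merges compatible structures and fails (None) on a clash.

def unify(fs1, fs2):
    """Unify two feature structures (nested dicts); return None on clash."""
    result = dict(fs1)
    for feat, val2 in fs2.items():
        if feat not in result:
            result[feat] = val2
        else:
            val1 = result[feat]
            if isinstance(val1, dict) and isinstance(val2, dict):
                sub = unify(val1, val2)
                if sub is None:
                    return None  # clash inside a nested structure
                result[feat] = sub
            elif val1 != val2:
                return None  # atomic value clash, e.g. sg vs. pl
    return result

# A verb that selects a 3rd-person-singular subject (illustrative entries)...
sleeps = {"HEAD": "verb", "SUBJ": {"AGR": {"PER": "3", "NUM": "sg"}}}
# ...unifies with "the dog" but not with "the dogs".
dog = {"AGR": {"PER": "3", "NUM": "sg"}}
dogs = {"AGR": {"PER": "3", "NUM": "pl"}}

print(unify(sleeps["SUBJ"], dog))   # succeeds: agreement is compatible
print(unify(sleeps["SUBJ"], dogs))  # None: number clash blocks the analysis

Because every constraint is an explicit, machine-checkable structure like this, HPSG grammars can be implemented and tested directly on a computer, which is the precision-for-breadth trade-off the next snip describes.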
Syntax: Breadth vs. Depth
- Theoretical syntax focuses on broad generalizations across languages.
- HPSG prioritizes detailed analyses, enabling computational implementation but requiring complex representations.