The Gradient: Perspectives on AI

Sasha Rush: Building Better NLP Systems

Feb 29, 2024
Professor Sasha Rush discusses the relative roles of learning and inference in AI, state-space models as an alternative to Transformers, efficiency techniques for NLP systems such as sequence-level knowledge distillation, and the evolution of his research perspective toward empirical approaches in NLP.
INSIGHT

Learning Over Inference in NLP

  • Sasha expected complex inference to play a critical role in NLP, but underestimated the impact of large-scale data and learning.
  • He did not foresee how much data would improve systems and reduce the importance of inference.
ADVICE

Encourage Bottom-Up Research Control

  • Sasha advocates for bottom-up control in research rather than top-down funding decisions.
  • He encourages building environments that support diverse ideas and risk-taking in NLP.
INSIGHT

Promise of State Space Models

  • State space models (SSMs) offer mathematical elegance and efficient training compared to transformers.
  • SSMs may overcome key-value caching issues and support longer context lengths.
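The caching point above can be made concrete with a toy example. The sketch below is illustrative only, not taken from the episode or any specific SSM architecture (e.g. S4 or Mamba); all matrices and dimensions are made-up assumptions. It shows why a linear state-space recurrence keeps a fixed-size state at inference time, whereas Transformer attention must cache one key and value per past token.

```python
import numpy as np

def ssm_step(h, x, A, B, C):
    """One recurrent SSM step: the state h has constant size,
    no matter how long the sequence gets."""
    h = A @ h + B * x  # state update: h_t = A h_{t-1} + B x_t
    y = C @ h          # readout:      y_t = C h_t
    return h, y

rng = np.random.default_rng(0)
d = 4                    # state dimension (fixed, toy choice)
A = 0.9 * np.eye(d)      # stable transition matrix (toy choice)
B = rng.standard_normal(d)
C = rng.standard_normal(d)

h = np.zeros(d)
for x in rng.standard_normal(100):   # process 100 "tokens"
    h, y = ssm_step(h, x, A, B, C)

# After 100 tokens the recurrent state is still just d floats,
# whereas an attention KV cache would hold 100 keys and values.
print(h.shape)
```

Because generation only carries `h` forward, memory stays constant in sequence length, which is one reason SSMs are attractive for long-context decoding.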