Practical AI

🤗 All things transformers with Hugging Face

Jul 27, 2020
Sasha Rush, an associate professor at Cornell Tech and a contributor at Hugging Face, shares insights on the evolution of natural language processing. He delves into the significance of the Transformers library and its impact on research. The conversation highlights the transition of AI conferences to virtual formats and the challenges of maintaining accessibility and innovation. Sasha also discusses the role of open source in shaping the AI landscape, emphasizing the importance of community collaboration and the future potential of NLP.
ANECDOTE

Hugging Face's Pivot

  • Hugging Face pivoted from chatbots to transformers, building what became the definitive library for the architecture.
  • Sasha Rush watched this transition from the outside before joining the company.
INSIGHT

Transformer Architecture

  • Transformers process sequential data with attention mechanisms rather than recurrent connections.
  • Attention gives the model random access to past information instead of forcing it through a step-by-step hidden state, which improves training speed and scalability; see the sketch after this list.
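To make the contrast concrete, here is a minimal NumPy sketch of the two access patterns. It is illustrative only, not code from the Transformers library: the tiny shapes, the single unprojected self-attention step, and the causal mask are assumptions chosen for brevity (real transformers use learned query/key/value projections and multiple heads).

    import numpy as np

    rng = np.random.default_rng(0)
    T, d = 5, 4                       # sequence length, hidden size
    x = rng.normal(size=(T, d))       # one embedded input sequence

    # Recurrent processing: each step depends on the previous hidden
    # state, so the T steps must run strictly one after another.
    W = rng.normal(size=(d, d)) / np.sqrt(d)
    h = np.zeros(d)
    for t in range(T):
        h = np.tanh(x[t] + h @ W)     # step t waits for step t-1

    # Attention-based processing: every position scores every earlier
    # position in one matrix product, so the whole sequence is handled
    # in parallel and any past token is directly reachable.
    scores = x @ x.T / np.sqrt(d)                   # pairwise relevance
    mask = np.tril(np.ones((T, T), dtype=bool))     # only look at the past
    scores = np.where(mask, scores, -np.inf)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax per row
    out = weights @ x                 # each row: weighted average of the past

The loop has to run its five steps in order, while the attention version computes all five outputs in a single pass; that parallelism is the random-access property the snip describes.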
INSIGHT

Attention Mechanism

  • Attention in transformers is a weighted averaging mechanism.
  • It assigns a probability to each part of the input sequence, letting the model focus on the most relevant information; a small numeric example follows this list.
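The sketch below works through that weighted average with concrete numbers. It is a toy illustration rather than library code: the three value vectors and the relevance scores are made-up inputs, and softmax is what turns raw scores into the probabilities that weight the average.

    import numpy as np

    def softmax(z):
        z = z - z.max()                 # shift for numerical stability
        e = np.exp(z)
        return e / e.sum()

    # One query attends over three input positions.
    values = np.array([[1.0, 0.0],      # position 0
                       [0.0, 1.0],      # position 1
                       [1.0, 1.0]])     # position 2
    scores = np.array([2.0, 0.1, 0.5])  # relevance of each position

    probs = softmax(scores)             # ~[0.73, 0.11, 0.16], sums to 1
    output = probs @ values             # weighted average of the values
    print(probs, output)

Because position 0 receives most of the probability mass, the output is dominated by its value vector: the model "focuses" on that part of the input while still mixing in a little of the rest.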