Super Data Science: ML & AI Podcast with Jon Krohn

747: Technical Intro to Transformers and LLMs, with Kirill Eremenko

Jan 9, 2024
Data scientist Kirill Eremenko discusses the basics of transformers and LLMs, emphasizing the five building blocks of transformer architecture and why transformers are so powerful. Topics include AI recruitment, a new course on LLMs, and the impact of LLMs on data science jobs.
AI Snips
ADVICE

Learn LLM Skills Early

  • Acquire LLM skills early to maximize competitiveness in the rapidly growing job market.
  • Salaries for LLM engineers are high now but may normalize as supply catches up with demand.
ADVICE

Study Transformers Deeply

  • Dedicate time to study transformers from diverse sources for thorough understanding.
  • Consider specialized courses that compile detailed research and explanations for efficient learning.
INSIGHT

Five Core Ingredients of LLMs

  • Large Language Models (LLMs) require vast amounts of data, transformer architectures, and extensive pre-training.
  • Optional steps include reinforcement learning from human feedback (RLHF) and domain-specific fine-tuning; a minimal attention sketch follows this list.
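The transformer architectures referenced above center on self-attention. As an illustration only (not code from the episode or the course), here is a minimal NumPy sketch of single-head scaled dot-product self-attention; the function name self_attention and the weight matrices Wq, Wk, Wv are assumptions chosen for the example.

# Illustrative sketch, not the episode's material: single-head scaled
# dot-product self-attention over a short sequence of token embeddings.
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """X has shape (seq_len, d_model); returns one attention output per token."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv           # project tokens into query/key/value spaces
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)            # pairwise similarity between all tokens
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over each row
    return weights @ V                         # each output is a weighted mix of all tokens

# Toy usage with random weights; the dimensions are arbitrary example choices.
rng = np.random.default_rng(0)
seq_len, d_model = 4, 8
X = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)     # -> (4, 8)

In a full transformer block, several such attention heads run in parallel and feed a position-wise feed-forward network; pre-training then fits these weights on the vast datasets mentioned above.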