
695: NLP with Transformers, feat. Hugging Face's Lewis Tunstall

Super Data Science: ML & AI Podcast with Jon Krohn


Exploring Transformers in NLP: Encoder-Decoder Structure and Model Evolution

This chapter explores the encoder-decoder structure and the development of models like GPT and BERT in natural language processing, explaining how transformers encode and decode information in an abstract representation space. It contrasts encoder-only models such as BERT, which specialize in language understanding, with decoder-only models like GPT, which excel at language generation.
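The encoder-only vs. decoder-only distinction described above comes down largely to the attention mask: encoders like BERT attend bidirectionally (every token can see every other token), while decoders like GPT use a causal mask so each token sees only itself and earlier positions. A minimal sketch of the two mask shapes (the function names are illustrative, not from the episode):

```python
def bidirectional_mask(seq_len: int) -> list[list[int]]:
    """Encoder-style (BERT-like) mask: all positions visible to all positions."""
    return [[1] * seq_len for _ in range(seq_len)]


def causal_mask(seq_len: int) -> list[list[int]]:
    """Decoder-style (GPT-like) mask: position i sees only positions j <= i."""
    return [[1 if j <= i else 0 for j in range(seq_len)]
            for i in range(seq_len)]


# For a 3-token sequence, the encoder mask is all ones, while the
# decoder mask is lower-triangular, enforcing left-to-right generation.
print(bidirectional_mask(3))  # [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
print(causal_mask(3))         # [[1, 0, 0], [1, 1, 0], [1, 1, 1]]
```

Full bidirectional context is what makes encoders strong at understanding tasks, while the causal mask is what lets decoders generate text one token at a time.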

