695: NLP with Transformers, feat. Hugging Face's Lewis Tunstall

Super Data Science: ML & AI Podcast with Jon Krohn

Exploring Transformers in NLP: Encoder-Decoder Structure and Model Evolution

This chapter explores the encoder-decoder structure of transformers and the development of models like GPT and BERT in natural language processing, highlighting how transformers encode and decode information in an abstract space. It contrasts encoder-only models such as BERT, which specialize in language understanding, with decoder-only models like GPT, which excel at language generation.
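A minimal sketch (not from the episode) of the core architectural difference the chapter describes: encoder-only models like BERT use bidirectional attention, where every token can attend to every other token, while decoder-only models like GPT apply a causal mask so each token sees only itself and earlier positions, which is what makes autoregressive generation possible.

```python
def encoder_mask(seq_len):
    """Bidirectional attention mask (encoder-only, e.g. BERT):
    entry [i][j] = 1 means token j is visible to token i."""
    return [[1] * seq_len for _ in range(seq_len)]


def decoder_mask(seq_len):
    """Causal attention mask (decoder-only, e.g. GPT):
    token i may only attend to positions j <= i."""
    return [[1 if j <= i else 0 for j in range(seq_len)]
            for i in range(seq_len)]


if __name__ == "__main__":
    # For a 4-token sequence, the encoder mask is all ones,
    # while the decoder mask is lower-triangular.
    for row in encoder_mask(4):
        print(row)
    print()
    for row in decoder_mask(4):
        print(row)
```

In a real transformer these masks are applied inside the attention computation; the lower-triangular decoder mask is why GPT-style models can generate text one token at a time.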
