Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer

Machine Learning Street Talk (MLST)

Unpacking NLP Training Paradigms

This chapter examines the effectiveness of unsupervised pre-training versus supervised multitask approaches in natural language processing. It discusses how the diversity of training data affects model performance and critiques current meta-learning techniques and their applicability to domain adaptation.
