
67 - GLUE: A Multi-Task Benchmark and Analysis Platform, with Sam Bowman

NLP Highlights

How to Train a Multi-Task Model

Multi-task models are what get us our best performance. We found that adding ELMo to anything gives you consistent improvements in performance across all of these tasks. Pre-training on some subset of these tasks will generally get you decent performance on the rest of them. And, probably unsurprisingly, using attention and using large models really helps. It's a really interesting set of experiments that you ran; it sounds like I need to look at it more closely.
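The multi-task recipe described here — one shared encoder with a small head per task, trained by sampling a task at each step — can be sketched as a toy loop. Everything below (the task names, the trivial "encoder" and "head" stand-ins) is invented for illustration and is not from the episode or the GLUE paper:

```python
import random

# Stand-ins for GLUE-style tasks (hypothetical names for this sketch).
TASKS = ["sst2", "mnli", "qnli"]

def shared_encoder(text):
    # Placeholder shared encoder: in a real model this would be a
    # sentence encoder (e.g. with attention); here it's just a length.
    return len(text)

def task_head(task, features):
    # Placeholder per-task head: each task applies its own transform.
    return features + TASKS.index(task)

def train_step(task, example):
    # One multi-task training step: shared encoder, then task-specific head.
    feats = shared_encoder(example)
    return task_head(task, feats)

random.seed(0)
history = []
for step in range(6):
    task = random.choice(TASKS)  # sample one task per training step
    history.append((task, train_step(task, "example text")))
```

The key design choice this illustrates is that only the heads are task-specific; the encoder parameters are shared, which is what lets pre-training on a subset of tasks transfer to the others.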
