67 - GLUE: A Multi-Task Benchmark and Analysis Platform, with Sam Bowman

NLP Highlights

The Future of Reusable Sentence Understanding Tools

In the short term, I think the kind of work that's most exciting is finding better ways to do pre-training: how best to do multitask learning, if you have multiple pre-training objectives that each seem to teach you something different, and how to do transfer learning. Do we want to start with a common model and fine-tune it for each of these tasks? Do we want to share parameters rigidly between these tasks? How do we want to do this?
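The quote contrasts two transfer strategies: rigid (hard) parameter sharing across tasks versus fine-tuning a separate copy of a common pre-trained model per task. Below is a minimal sketch of both in PyTorch; the encoder architecture, task names, and dimensions are illustrative assumptions, not anything specified in the episode.

```python
# Sketch of the two transfer strategies discussed above (illustrative only).
import copy
import torch
import torch.nn as nn

class SharedEncoder(nn.Module):
    """A common sentence encoder whose parameters can be shared or fine-tuned."""
    def __init__(self, vocab_size=10000, dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.rnn = nn.LSTM(dim, dim, batch_first=True)

    def forward(self, tokens):
        emb = self.embed(tokens)
        _, (h, _) = self.rnn(emb)
        return h[-1]  # final hidden state as the sentence representation

# Strategy 1: rigid (hard) parameter sharing -- one encoder trained jointly on
# every task, with a small task-specific head on top. Task names are hypothetical.
encoder = SharedEncoder()
heads = nn.ModuleDict({
    "sst2": nn.Linear(256, 2),  # sentiment: 2 classes
    "mnli": nn.Linear(256, 3),  # natural language inference: 3 classes
})

def multitask_logits(task, tokens):
    return heads[task](encoder(tokens))

# Strategy 2: start from the common (pre-trained) encoder and fine-tune a
# separate copy per task, so each task's parameters are free to diverge.
finetuned = {task: copy.deepcopy(encoder) for task in heads}

# Usage: route a batch through the shared encoder and one task head.
tokens = torch.randint(0, 10000, (4, 12))      # batch of 4 sequences, length 12
print(multitask_logits("mnli", tokens).shape)  # torch.Size([4, 3])
```

The trade-off the quote raises is exactly the one between these two sketches: hard sharing forces every task to shape the same parameters, while per-task fine-tuning preserves a shared starting point but lets the models drift apart.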
