
96 - Question Answering as an Annotation Format, with Luke Zettlemoyer

NLP Highlights


The Importance of Pre-Training in Deep Learning

I think there could be a lot more of really looking for tasks where things fail and trying to get that supervision into these models. What is it that we don't get from, you know, masked language modeling or the new generalizations of it? Those things would be candidates for trying to label at scale, or for thinking of another proxy task. Maybe you're lucky and you don't have to label it; maybe you can find data already. Yeah. Great. This has been really interesting. As a last topic, I think it'd be nice to come back to this contextualization idea and pre-training. How do you think the notion of getting data at scale with natural language
