
68 - Neural models of factuality, with Rachel Rudinger


Multi-Task Learning

The data sets aren't huge, so it's not that surprising that you're going to get gains by combining the data, right? In this case, it looks like it does. I guess another thing, in talking with people at ACL, we were wondering how much of the gain from multi-task learning is just from seeing more language.

Yeah, I think that's a very real phenomenon, and something that we have seen in closely related work on semantic proto-role labeling. We tried a bunch of different, related semantic role labeling tasks, and nothing worked better than initializing the BiLSTM, initializing the encoder…
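The finding described above, that initializing the encoder from a related task can beat joint multi-task training, can be sketched in miniature. This is a hedged illustration, not the speakers' actual code: parameters are plain Python lists, and `pretrain_on_related_task` is a hypothetical stand-in for real training on the related task.

```python
import random

def init_params(seed):
    # Stand-in for a randomly initialized model: an "encoder" (e.g. a
    # BiLSTM in the episode's setting) plus a task-specific classifier.
    rng = random.Random(seed)
    return {"encoder": [rng.random() for _ in range(4)],
            "classifier": [rng.random() for _ in range(2)]}

def pretrain_on_related_task(params):
    # Hypothetical stand-in for training on a related task (e.g. a
    # semantic proto-role labeling variant); it just perturbs weights.
    params["encoder"] = [w + 1.0 for w in params["encoder"]]
    return params

# Pretrain a source model, then transfer ONLY its encoder weights into a
# fresh target model; the target's classifier stays randomly initialized
# and would then be trained on the target task alone.
source = pretrain_on_related_task(init_params(seed=0))
target = init_params(seed=1)
target["encoder"] = list(source["encoder"])
```

The design choice mirrored here is that transfer happens once, at initialization, rather than by sharing the encoder across tasks during training as multi-task learning would.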

