
61 - Neural Text Generation in Stories, with Elizabeth Clark and Yangfeng Ji

NLP Highlights

Perplexity in Text Generation

The system was inspired by the prior work, people working on referring expression generation. In this case, it is not a language model; you evaluate the whole text, including information from the other text. And so in that sense, using perplexity seems like a reasonable evaluation metric for this kind of system. But I think I disagree with you here, Matt. I think perplexity is the best we can do if we have no clue what the language model is going to be used for.
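
For reference, perplexity is the exponential of the average per-token negative log-likelihood a model assigns to held-out text. A minimal illustrative sketch (not from the episode; the function name and example values are hypothetical):

```python
import math

def perplexity(token_log_probs):
    """Perplexity from per-token log-probabilities (natural log)."""
    avg_nll = -sum(token_log_probs) / len(token_log_probs)
    return math.exp(avg_nll)

# Example: a model assigning probability 0.25 to each of four tokens
print(perplexity([math.log(0.25)] * 4))  # -> 4.0
```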
