
61 - Neural Text Generation in Stories, with Elizabeth Clark and Yangfeng Ji

NLP Highlights


How to Train a Softmax Model

In this work, we actually required the training data to have the coreference information annotated. And so when it comes time to generate the next word, the model makes a couple of decisions. It decides first: should the next word refer to an entity? If so, which entity should it refer to, how many words should be in the mention, and things like that. Based on that decision, if it decides the next word should refer to entity A, it takes entity A's representation at the current time and uses that as another form of context for generation. So it is a hard decision.
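The hierarchical decision described above can be sketched in code. This is a minimal toy illustration, not the authors' actual model: the class name, weight matrices, and the dot-product entity scoring are all assumptions made for the example, and the "hard decision" is shown as a simple threshold on a sigmoid.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class EntityAwareGenerator:
    """Toy sketch (hypothetical) of the per-step decisions described in the
    transcript: (1) should the next word refer to an entity, (2) which
    entity, (3) how long the mention is. The chosen entity's current
    representation is then used as extra context for word generation."""

    def __init__(self, hidden_dim, vocab_size, entity_dim, max_len=5):
        self.W_ref = rng.normal(size=hidden_dim)                 # refer / don't refer
        self.W_len = rng.normal(size=(max_len, hidden_dim))      # mention length 1..max_len
        self.W_vocab = rng.normal(size=(vocab_size, hidden_dim + entity_dim))

    def step(self, h, entities):
        # 1) hard decision: does the next word refer to an entity?
        p_ref = 1.0 / (1.0 + np.exp(-h @ self.W_ref))
        refers = bool(p_ref > 0.5)
        if refers:
            # 2) pick an entity (dot-product score against the hidden state)
            scores = softmax(np.array([h @ e for e in entities]))
            idx = int(np.argmax(scores))
            # 3) pick a mention length
            length = int(np.argmax(self.W_len @ h)) + 1
            context = entities[idx]            # entity A's current representation
        else:
            idx, length = None, 0
            context = np.zeros(entities.shape[1])
        # word distribution conditioned on [hidden state; entity context]
        p_word = softmax(self.W_vocab @ np.concatenate([h, context]))
        return refers, idx, length, p_word
```

In a full model the entity representations would also be updated after each mention; here they are treated as fixed vectors to keep the sketch short.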
