
61 - Neural Text Generation in Stories, with Elizabeth Clark and Yangfeng Ji

NLP Highlights

How to Generate Entities and Generate the Next Word of a Story

The model that we propose in this paper builds off a seq2seq baseline, which is what had been used in previous writing tasks. The basic idea is that as you go through a text, every time you encounter an entity, a vector is created to represent that entity, and each time the entity is mentioned, that vector representation is updated. So when it comes time to generate the next word of the story, what you have access to is a collection of vectors, one for each entity, representing the current state of that entity in the narrative so far.
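The entity-tracking idea described above can be sketched in a few lines: keep one vector per entity, create it on first mention, update it on every subsequent mention, and expose the full collection when generating the next word. This is a minimal illustrative sketch, not the paper's actual parameterization; the class name, the fixed interpolation gate, and the unit-norm renormalization are all simplifying assumptions made here.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8  # toy embedding size, chosen arbitrarily for this sketch


class EntityTracker:
    """Toy sketch of entity-vector tracking; names and update rule are
    illustrative, not the paper's learned parameterization."""

    def __init__(self, dim=DIM):
        self.dim = dim
        self.entities = {}  # entity name -> current vector

    def mention(self, name, context_vec):
        # Create the entity's vector on its first mention...
        if name not in self.entities:
            self.entities[name] = rng.standard_normal(self.dim)
        # ...and on every mention, blend the old state with the current
        # context (a fixed gate stands in for a learned update here).
        gate = 0.5
        updated = gate * self.entities[name] + (1 - gate) * context_vec
        # Keep the vector at unit length so updates stay comparable in scale.
        self.entities[name] = updated / np.linalg.norm(updated)

    def context_for_generation(self):
        # At generation time, the model sees one vector per entity so far.
        return list(self.entities.values())


tracker = EntityTracker()
tracker.mention("Alice", rng.standard_normal(DIM))
tracker.mention("Bob", rng.standard_normal(DIM))
tracker.mention("Alice", rng.standard_normal(DIM))  # updates, does not recreate
print(len(tracker.context_for_generation()))  # → 2, one vector per entity
```

A real model would compute the gate and the blended update with learned weights and feed the entity vectors into the decoder's next-word distribution; the sketch only shows the bookkeeping that the speaker describes.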
