
126 - Optimizing Continuous Prompts for Generation, with Lisa Li

NLP Highlights

CHAPTER

The Importance of Prefix Tuning in Summarization Tasks

In table-to-text, the encoder doesn't have much of a burden, because it just has to figure out the right way to encode all the table content, and then the decoder is in charge of rendering it in a natural way. So the fact that fine-tuning the encoder leads to a performance improvement in the summarization setting means that when we require the encoder to perform a complex task, such as extracting important information, prefixing is not that effective, or prefixing becomes the bottleneck that leads to a performance drop. Do you know whether they also had an evaluation setting where they tested the generalization of this approach, and whether combining fine-tuning and prefix tuning also gives us...
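To make the contrast with fine-tuning concrete: in prefix tuning the base model's weights stay frozen, and only a small set of continuous prefix vectors prepended to the input is trained. The sketch below illustrates the idea in PyTorch; the class name and the dimensions (`d_model`, `prefix_len`) are illustrative assumptions, not taken from the paper's implementation.

```python
import torch
import torch.nn as nn

class PrefixTunedEncoder(nn.Module):
    """Minimal sketch of prefix tuning (hypothetical names): a frozen
    Transformer encoder plus trainable continuous prefix vectors that
    are prepended to the input embeddings."""

    def __init__(self, d_model=16, prefix_len=4, n_layers=2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        for p in self.encoder.parameters():
            p.requires_grad = False  # base model stays frozen
        # only these continuous prompt vectors receive gradients
        self.prefix = nn.Parameter(torch.randn(prefix_len, d_model) * 0.02)

    def forward(self, x):  # x: (batch, seq_len, d_model) of input embeddings
        prefix = self.prefix.unsqueeze(0).expand(x.size(0), -1, -1)
        return self.encoder(torch.cat([prefix, x], dim=1))

model = PrefixTunedEncoder()
trainable = [n for n, p in model.named_parameters() if p.requires_grad]
```

Here `trainable` contains only `prefix`, which is what makes the method so parameter-efficient, and also why, as discussed above, a hard encoding task like summarization can turn the fixed-capacity prefix into a bottleneck.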

