
126 - Optimizing Continuous Prompts for Generation, with Lisa Li
NLP Highlights
The Difference Between Manual and GPT-3 Prompts
In contrast with GPT-3-style few-shot learning, you still have a prefix here, but as opposed to the manually written prompts used with GPT-2 or GPT-3, your prompts are continuous, right? I mean, they're not interpretable and don't correspond to specific words in the vocabulary. If we try to optimize a prompt aggressively for one task, it's not going to be interpretable, or its wording is going to be a bit strange. Can you tell us a little bit more about how prefix tuning is different from conditional generation models like CTRL, where we also have a prefix, but an actual word, like "summarization:"?
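The contrast being drawn here can be made concrete with a small sketch. The code below is a minimal toy illustration (not Li's implementation; the dimensions and values are made up): a discrete prompt is a sequence of token ids that indexes a frozen embedding table, while a continuous prefix is a free parameter matrix prepended to the input embeddings, whose rows need not match any vocabulary entry.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, d_model, prefix_len = 100, 16, 4

# Frozen embedding table for real vocabulary tokens.
embed = rng.normal(size=(vocab_size, d_model))

# A discrete prompt like "summarization:" is a sequence of token ids,
# so its representation is just rows of the embedding table.
discrete_prompt_ids = np.array([7, 42])          # hypothetical ids
discrete_prompt = embed[discrete_prompt_ids]     # shape (2, d_model)

# A continuous prefix is a free trainable matrix: its rows are optimized
# directly and have no reading in the vocabulary, hence not interpretable.
prefix = rng.normal(size=(prefix_len, d_model))

# The model consumes [prefix; input embeddings]; in prefix tuning only
# the prefix parameters would receive gradient updates.
input_ids = np.array([3, 5, 9])
x = np.concatenate([prefix, embed[input_ids]], axis=0)
print(x.shape)  # (prefix_len + len(input_ids), d_model) = (7, 16)
```

The key point the speakers make is visible in the shapes: the discrete prompt lives inside the embedding table, the continuous prefix lives outside it.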