126 - Optimizing Continuous Prompts for Generation, with Lisa Li

NLP Highlights

The Differences Between Prefix Tuning and Adapter Tuning

The takeaway is that prefix tuning extrapolates to all three of these datasets and works well across a wide range of domains. Perhaps you have already mentioned this, but can you reiterate why prefix tuning, with the same number of parameters as adapter tuning, performs a little better?

We think this gain in parameter efficiency comes from prefix tuning keeping the pre-trained language model intact as much as possible, more so than adapter tuning. That said, we don't yet have a detailed comparison or a theoretical analysis of how these different inductive biases, or these different architecture designs, lead to different performance. Thank you for your time.
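To make the distinction concrete, here is a minimal sketch (not from the episode) of the idea being discussed: prefix tuning freezes every weight of the pretrained language model and trains only per-layer prefix key/value activations that the frozen attention layers attend to. The sketch assumes a Hugging Face GPT-2-style model in PyTorch that accepts `past_key_values`; the paper additionally reparameterizes the prefix through an MLP during training, which is omitted here.

```python
import torch
import torch.nn as nn

class PrefixTuningWrapper(nn.Module):
    """Minimal prefix-tuning sketch: the pretrained LM is frozen; only
    the prefix key/value activations are trained, which is the property
    the answer above credits for prefix tuning's parameter efficiency."""

    def __init__(self, lm, prefix_len=10):
        super().__init__()
        self.lm = lm
        for p in self.lm.parameters():
            p.requires_grad = False          # keep the pretrained LM intact
        cfg = lm.config
        n_head = cfg.num_attention_heads
        head_dim = cfg.hidden_size // n_head
        self.prefix_len = prefix_len
        # One trainable (key, value) block per layer:
        # shape (num_layers, 2, n_head, prefix_len, head_dim).
        self.prefix = nn.Parameter(
            0.02 * torch.randn(cfg.num_hidden_layers, 2,
                               n_head, prefix_len, head_dim))

    def forward(self, input_ids, attention_mask, labels=None):
        b = input_ids.size(0)
        # Broadcast the shared prefix over the batch; each layer gets a
        # (key, value) pair shaped (batch, n_head, prefix_len, head_dim).
        past = tuple(
            (kv[0].unsqueeze(0).expand(b, -1, -1, -1),
             kv[1].unsqueeze(0).expand(b, -1, -1, -1))
            for kv in self.prefix)
        # Extend the mask so real tokens can attend to the prefix positions.
        prefix_mask = torch.ones(b, self.prefix_len,
                                 device=input_ids.device)
        mask = torch.cat([prefix_mask, attention_mask], dim=1)
        return self.lm(input_ids=input_ids, attention_mask=mask,
                       past_key_values=past, labels=labels)
```

Adapter tuning, by contrast, inserts small trainable layers between the frozen ones, so every activation flows through new modules; prefix tuning only changes what the frozen attention layers can attend to, leaving the original computation path untouched.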
