
130 - Linking human cognitive patterns to NLP Models, with Lisa Beinborn

NLP Highlights

CHAPTER

Is There a Way to Make It Work?

First words end up more important, independent of what comes after them. The problem is that if we go from left to right, then when we mask a token, we can only look at the previous tokens. And if we then average, accumulate over everything, the first tokens in a sentence automatically get higher importance. Even if they contribute very little at each step, they are involved so many times that their score ends up high. So I would just say our setup doesn't generalize to this case. I think you can still do it, but you have to find a way to account for this ordering effect.
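
The ordering effect described here can be made concrete with a toy calculation. The sketch below is hypothetical code, not from the episode: it assumes every visible token contributes equally when predicting the next position, and even then, summing attributions over all positions inflates the earliest tokens simply because they appear in more contexts.

```python
import numpy as np

def accumulated_importance(n_tokens: int) -> np.ndarray:
    """Sum per-token importance under left-to-right masking.

    When predicting position t we can only attribute to tokens 0..t-1,
    so earlier tokens are scored in more contexts than later ones.
    """
    totals = np.zeros(n_tokens)
    for t in range(1, n_tokens):
        # Assume each visible token contributes equally (1/t) to
        # predicting position t, i.e. no token is truly more important.
        totals[:t] += 1.0 / t
    return totals

print(accumulated_importance(6).round(2))
# [2.28 1.28 0.78 0.45 0.2  0.  ] -- the first token's accumulated
# score dominates purely because it appears in every context.
```

Averaging instead of summing does not remove the skew either, because the earlier tokens are still evaluated against many more prediction steps; this is the accounting problem the speaker says a left-to-right setup would need to solve.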

