
Hungry Hungry Hippos - H3
Deep Papers
Can You Recall a Token?
In our paper, we actually found that SSMs could not immediately do that recall, and the reason is a little bit interesting. One way to look at attention is that it makes comparisons across your entire sequence. So attention can do recall by brute force, saying, okay, I'm going to look one at a time at each word, whereas a state space model can't do that directly.
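The "brute force" comparison the speaker describes can be sketched in a toy associative-recall task. This is a hypothetical illustration, not code from the paper: the sequence stores key-value pairs, and an attention-style query compares itself against every earlier token to find the matching key and read off the token that followed it.

```python
import numpy as np

# Hypothetical associative-recall toy: the sequence interleaves keys and
# values ("a" -> "1", "b" -> "2", ...); when a key repeats at the end,
# attention-style all-pairs comparison can recall its value.
vocab = {"a": 0, "1": 1, "b": 2, "2": 3, "c": 4, "3": 5}
seq = ["a", "1", "b", "2", "c", "3", "b"]   # final "b" should recall "2"

E = np.eye(len(vocab))                       # one-hot embeddings
X = np.stack([E[vocab[t]] for t in seq])     # (T, d) sequence of embeddings

q = X[-1]                                    # the query token ("b")
scores = X[:-1] @ q                          # compare query to EVERY earlier token
match = int(np.argmax(scores))               # position of the earlier "b"
recalled = seq[match + 1]                    # the token that followed it
print(recalled)                              # -> "2"
```

The key point is the `scores` line: attention gets to compare the query against every prior position at once, which is exactly the all-pairs lookup a fixed-size recurrent state (as in a vanilla SSM) does not provide for free.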