Can You Recall a Token?
In our paper, we actually found that SSMs could not immediately do that recall, and the reason is a little bit interesting. One way to look at attention is that it makes comparisons across your entire sequence. So attention can do recall by brute force, essentially saying, okay, I'm going to go look one at a time at each word. With a state space model, you can't do that.
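The contrast described above can be sketched in a few lines of NumPy: attention keeps per-token access to the whole history by scoring a query against every past token, while a state space model squeezes the history into one fixed-size state updated token by token. This is an illustrative toy, not the actual models from the paper; the matrices and sizes are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
d, T = 8, 5                       # toy embedding size and sequence length
x = rng.normal(size=(T, d))       # token embeddings

# Attention: compare the final query against EVERY token in the
# sequence -- the "brute force, one word at a time" lookup.
q = x[-1]
scores = x @ q                    # one similarity score per token
weights = np.exp(scores - scores.max())
weights /= weights.sum()
recalled = weights @ x            # weighted mix over the whole history

# State space model: the same history passes through a single
# fixed-size state; individual tokens are no longer addressable.
A = 0.9 * np.eye(d)               # toy state-transition matrix
state = np.zeros(d)
for t in range(T):
    state = A @ state + x[t]      # recurrent update; past tokens blur together

print(recalled.shape, state.shape)
```

Both outputs are `d`-dimensional vectors, but only the attention path retained the ability to single out one specific past token, which is why recall is easy for attention and hard for a fixed-state recurrence.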