The Relationship Between the Number of States in a State Space Model and Sequence Length
So in the H3 paper proper, we looked at language modeling at several different model sizes. We actually haven't seen that you need a particularly large state for longer sequences. Maybe if we scale to hundreds of thousands or millions of tokens, the tune will change. But so far, in what we've seen, the state vector size we use is 64, and that's the one that seems to be working.
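To make the point concrete, here is a minimal sketch of a diagonal linear state space recurrence in NumPy. It is illustrative only, not code from the H3 paper; the function name and parameter shapes are assumptions for this sketch. What it shows is why state size and sequence length are separate axes: the recurrent state stays at a fixed dimension N (64 here, matching the value mentioned above) no matter how many tokens are folded into it, so memory for the state is O(N) regardless of T.

```python
import numpy as np

def ssm_scan(x, A, B, C):
    """Run the recurrence h_t = A * h_{t-1} + B * x_t, with readout y_t = C . h_t.

    x: (T,) input sequence of scalars
    A: (N,) diagonal state transition (entries with |a| < 1 for stability)
    B: (N,) input projection
    C: (N,) output projection
    Returns y: (T,). The state h is always size N, independent of T.
    """
    N = A.shape[0]
    h = np.zeros(N)
    y = np.empty_like(x)
    for t in range(x.shape[0]):
        h = A * h + B * x[t]   # fixed-size state update
        y[t] = C @ h           # readout from the state
    return y

rng = np.random.default_rng(0)
N = 64                           # state vector size discussed above
A = 0.999 * rng.uniform(size=N)  # stable diagonal dynamics
B = rng.normal(size=N)
C = rng.normal(size=N) / N

# The same 64-dimensional state handles a short or a very long sequence.
for T in (128, 100_000):
    y = ssm_scan(rng.normal(size=T), A, B, C)
    print(T, y.shape)
```

The open empirical question in the quote is just whether N = 64 keeps carrying enough information as T grows into the hundreds of thousands or millions; the architecture itself never forces N to scale with T.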