2min chapter

Language Modeling With State Space Models with Dan Fu - #630

The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)

CHAPTER

The Relationship Between State Size in a State Space Model and Sequence Length

So in the H3 paper proper, we looked at language modeling at several different model sizes. We actually haven't seen that you need a particularly large state for longer sequences. Maybe if we scale to hundreds of thousands or millions of tokens, the tune will change. But so far, in what we've seen, we use a state vector size of 64, and that's the one that seems to be working so far.
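As a rough illustration of what "state vector size" means here, below is a minimal sketch of a single-channel linear state space recurrence with a 64-dimensional state. The matrices are random placeholders, not the actual H3 parameterization, and the names are illustrative assumptions; the point is only that the recurrent state stays a fixed 64 numbers per channel no matter how long the sequence gets.

```python
import numpy as np

# Minimal sketch: one linear SSM channel with state dimension 64.
# d_state mirrors the size mentioned in the episode; everything else
# is a placeholder, not the paper's parameterization.

d_state = 64          # size of the hidden state vector x_t
seq_len = 2048        # length of the input sequence

rng = np.random.default_rng(0)
A = np.diag(-np.abs(rng.standard_normal(d_state)) * 0.1)  # stable diagonal transition
B = rng.standard_normal((d_state, 1))
C = rng.standard_normal((1, d_state))

u = rng.standard_normal(seq_len)   # one scalar input per time step
x = np.zeros(d_state)              # the state: fixed size regardless of seq_len
y = np.empty(seq_len)

for t in range(seq_len):
    # x_t = A x_{t-1} + B u_t ;  y_t = C x_t
    x = A @ x + B[:, 0] * u[t]
    y[t] = C[0] @ x

# Memory held by the recurrence is d_state floats however long the
# sequence grows, which is the point being made in the quote.
print(x.shape, y.shape)   # (64,) (2048,)
```

In a full model this recurrence is applied per channel and stacked into layers; the observation in the quote is that growing the sequence length has not, so far, required growing d_state beyond 64.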
