
Mamba, Mamba-2 and Post-Transformer Architectures for Generative AI with Albert Gu - #693

The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)

NOTE

Efficiency vs. Memory in State and Attention Models

State-based models aim to find the right-sized state: one that stores all the necessary information as compactly as possible, giving explicit control over the trade-off between efficiency and memory. Attention-based models, by contrast, remember every specific detail they have seen and offer no control over what is stored. This explicit control over state size is what distinguishes modern state-based models from older RNNs, and it underscores how central the state is to the memory-efficiency trade-off.
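For intuition only, here is a minimal NumPy sketch (not from the episode) contrasting the two memory mechanisms: a state-space-style recurrence compresses everything seen so far into a fixed-size state vector, while an attention-style step appends every token to a cache that grows with the sequence. All names, dimensions, and parameter values are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, d_state, seq_len = 8, 4, 16

# Arbitrary illustrative parameters for a linear recurrence.
A = 0.9 * np.eye(d_state)                        # state transition (decay)
B = 0.1 * rng.normal(size=(d_state, d_model))    # input -> state
C = 0.1 * rng.normal(size=(d_model, d_state))    # state -> output

def ssm_step(state, x_t):
    """State-space-style step: memory lives in a fixed-size state vector."""
    state = A @ state + B @ x_t
    y_t = C @ state
    return state, y_t

def attention_step(kv_cache, x_t):
    """Attention-style step: memory is the ever-growing cache of past tokens."""
    kv_cache.append(x_t)                          # cache grows with sequence length
    keys = np.stack(kv_cache)                     # (t, d_model)
    scores = keys @ x_t / np.sqrt(d_model)        # similarity to every stored token
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    y_t = weights @ keys
    return kv_cache, y_t

state = np.zeros(d_state)
kv_cache = []
for t in range(seq_len):
    x_t = rng.normal(size=d_model)
    state, _ = ssm_step(state, x_t)
    kv_cache, _ = attention_step(kv_cache, x_t)

print("State-space memory footprint:", state.shape)                  # fixed: (d_state,)
print("Attention memory footprint:", len(kv_cache), "cached tokens")  # grows with t
```

The design point the note makes falls out directly: the recurrent state stays the same size no matter how long the sequence gets (you choose how much to remember), whereas the attention cache stores every token exactly and grows without bound.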
