
Mamba, Mamba-2 and Post-Transformer Architectures for Generative AI with Albert Gu - #693
The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)
Advancements in Sequence Models
This chapter explores the evolution of sequence models in reinforcement learning, weighing the benefits of recurrent models against transformers. It then introduces post-transformer approaches, particularly state-space models, emphasizing context compression and the need for diverse architectures tailored to different data types.
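The "context compression" point is the key contrast with transformers: a state-space model folds the whole input history into a fixed-size recurrent state, whereas a transformer's KV cache grows with sequence length. Below is a minimal illustrative sketch of a linear state-space recurrence (not Albert Gu's Mamba code; all names and shapes are assumptions for illustration):

```python
# Minimal sketch of the state-space idea discussed in the episode:
# the entire context is compressed into a fixed-size state h, so
# per-token memory is constant rather than growing like a KV cache.
import numpy as np

def ssm_scan(x, A, B, C):
    """Run a simple linear state-space recurrence over a sequence.

    h_t = A @ h_{t-1} + B @ x_t   (state update: compresses the context)
    y_t = C @ h_t                 (readout)

    x: (seq_len, d_in); A: (d_state, d_state); B: (d_state, d_in);
    C: (d_out, d_state). Shapes and names here are illustrative only.
    """
    h = np.zeros(A.shape[0])       # fixed-size state, regardless of seq_len
    ys = []
    for x_t in x:                  # one constant-memory step per token
        h = A @ h + B @ x_t
        ys.append(C @ h)
    return np.stack(ys)

# Toy usage: a 1000-token sequence is summarized by a 16-dim state.
rng = np.random.default_rng(0)
x = rng.normal(size=(1000, 4))
A = 0.9 * np.eye(16)               # stable dynamics so the state stays bounded
B = 0.1 * rng.normal(size=(16, 4))
C = rng.normal(size=(2, 16))
y = ssm_scan(x, A, B, C)
print(y.shape)                     # (1000, 2)
```

Mamba's contribution, as discussed in the episode, builds on this basic recurrence by making the dynamics input-dependent; the fixed-state property sketched above is what it shares with other recurrent, post-transformer architectures.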