Mamba, Mamba-2 and Post-Transformer Architectures for Generative AI with Albert Gu - #693

The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)

Advancements in Sequence Models

This chapter traces the evolution of sequence models in reinforcement learning, weighing recurrent models against transformers. It then introduces post-transformer approaches, particularly state-space models, emphasizing the importance of context compression and the case for diverse architectures tailored to specific data types.
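To make the "context compression" idea concrete, here is a minimal NumPy sketch of the linear recurrence at the heart of state-space models: the whole history is folded into a fixed-size hidden state rather than a growing attention cache. The diagonal A, the shapes, and the function name ssm_scan are illustrative assumptions, not Mamba's actual parameterization.

```python
import numpy as np

def ssm_scan(A, B, C, xs):
    """Linear SSM recurrence: h_t = A * h_{t-1} + B * x_t,  y_t = C . h_t.

    Unlike a transformer's key-value cache, which grows with sequence
    length, the entire context is compressed into the fixed-size state h.
    A is assumed diagonal (stored as a vector) for simplicity.
    """
    d_state = B.shape[0]
    h = np.zeros(d_state)
    ys = []
    for x in xs:                 # one step per input token
        h = A * h + B * x        # fixed-size state update
        ys.append(C @ h)         # readout from the compressed state
    return np.array(ys)

d_state = 4
A = np.full(d_state, 0.9)        # decay rate: how quickly old context fades
B = np.ones(d_state)
C = np.ones(d_state) / d_state
print(ssm_scan(A, B, C, xs=[1.0, 0.0, 0.0, 2.0]))
```

Because the state is fixed-size, each step costs the same regardless of how long the sequence is, which is the efficiency argument for state-space models over attention discussed in the episode.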
