Mamba, Mamba-2 and Post-Transformer Architectures for Generative AI with Albert Gu - #693

CHAPTER

Intro

This chapter examines the trade-off between performance and efficiency in post-transformer architectures, with an emphasis on memory usage in sequence models. It discusses attention mechanisms, recurrent models, and generative AI innovations such as Mamba and Mamba-2.

00:00
