
Mamba, Mamba-2 and Post-Transformer Architectures for Generative AI with Albert Gu - #693

The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)


Intro

This chapter examines the trade-off between performance and efficiency in post-transformer architectures, with an emphasis on memory usage in sequence models. It discusses attention mechanisms, recurrent models, and generative AI innovations such as Mamba and Mamba-2.
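To make the memory trade-off concrete: attention caches every past key/value pair, so its memory grows linearly with sequence length, while a recurrent state-space model compresses history into a fixed-size state. The sketch below is illustrative only (a plain diagonal linear recurrence, not the actual Mamba implementation, which adds input-dependent "selective" parameters and a hardware-aware scan); all names and values are made up for the example.

```python
import numpy as np

def ssm_scan(A, B, C, xs):
    """Toy diagonal linear state-space recurrence:
        h_t = A * h_{t-1} + B * x_t   (elementwise; A is diagonal)
        y_t = C . h_t
    Memory stays O(d_state) no matter how long xs is, unlike an
    attention KV cache, which grows with every step."""
    h = np.zeros(A.shape[0])   # fixed-size recurrent state
    ys = []
    for x in xs:               # one scalar input per time step
        h = A * h + B * x      # constant-memory state update
        ys.append(float(C @ h))
    return ys

# Hypothetical toy parameters: 4-dim state, scalar input/output.
A = np.full(4, 0.9)    # per-dimension decay
B = np.ones(4)
C = np.ones(4) / 4.0

ys = ssm_scan(A, B, C, [1.0, 0.0, 0.0])
# An impulse decays geometrically through the state: 1.0, 0.9, 0.81
```

The key design point the episode returns to is exactly this: because the state is a fixed-size summary rather than a growing cache, inference cost per token is constant, and the research question becomes how much expressivity that compressed state can retain.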

