#299 Jacob Buckman: Why the Future of AI Won't Be Built on Transformers

Eye On A.I.

How state-space/retention models differ from transformers (from 07:35)

Craig asks about state-space models and Mamba; Jacob explains retention models, the duality between their recurrent and attention-like forms, and the chunked formulation.
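
To make the recurrence/attention duality concrete, here is a minimal sketch (my own illustration, not code from the episode) of a retention-style layer in the spirit of RetNet. The function names, the decay parameter gamma, and the toy dimensions are all assumptions for illustration; the point is that a parallel, attention-like computation with a decay mask and a recurrence over a fixed-size state produce the same outputs:

```python
import numpy as np

def retention_parallel(Q, K, V, gamma):
    """Attention-like form: O = (Q K^T * D) V, where D[n, m] = gamma^(n-m) for n >= m."""
    T = Q.shape[0]
    idx = np.arange(T)
    D = np.tril(gamma ** (idx[:, None] - idx[None, :]))  # causal decay mask
    return (Q @ K.T * D) @ V

def retention_recurrent(Q, K, V, gamma):
    """Recurrent form: S_t = gamma * S_{t-1} + k_t (outer) v_t, then o_t = q_t S_t."""
    d_k, d_v = Q.shape[1], V.shape[1]
    S = np.zeros((d_k, d_v))         # fixed-size state, independent of sequence length
    O = np.zeros((Q.shape[0], d_v))
    for t in range(Q.shape[0]):
        S = gamma * S + np.outer(K[t], V[t])
        O[t] = Q[t] @ S
    return O

rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((8, 4)) for _ in range(3))
print(np.allclose(retention_parallel(Q, K, V, 0.9),
                  retention_recurrent(Q, K, V, 0.9)))  # True: both forms agree
```

The chunked formulation mentioned in the chapter sits between these two: the sequence is split into blocks, the parallel form is used within each block, and the recurrence carries the state across blocks, trading off between training-time parallelism and constant-memory inference.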
