#299 Jacob Buckman: Why the Future of AI Won't Be Built on Transformers

Eye On A.I.

Why many 'long-context' models underperform

Craig asks why performance degrades across long contexts; Jacob explains that sparse/windowed attention and limited long-context training cause these failures.
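
To make the "windowed attention" point concrete, below is a minimal NumPy sketch of sliding-window (local) attention, written for this summary rather than taken from the episode; the function name, window size, and tensor shapes are illustrative assumptions.

import numpy as np

def sliding_window_attention(q, k, v, window=4):
    # Causal attention in which each query sees only the last `window` keys.
    seq_len, d = q.shape
    scores = q @ k.T / np.sqrt(d)                       # (seq_len, seq_len)
    i = np.arange(seq_len)[:, None]
    j = np.arange(seq_len)[None, :]
    # Position i may attend to j only if i - window < j <= i;
    # anything older than the window is masked out entirely.
    mask = (j <= i) & (j > i - window)
    scores = np.where(mask, scores, -np.inf)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v                                   # (seq_len, d)

# A token at position 0 gets zero attention weight from any query at
# position >= window, so its information survives only if earlier layers
# happened to copy it forward -- one mechanism behind long-context failures.
rng = np.random.default_rng(0)
q = k = v = rng.standard_normal((16, 8))
print(sliding_window_attention(q, k, v, window=4).shape)   # (16, 8)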

Snippet starts at 30:50.
