
#299 Jacob Buckman: Why the Future of AI Won't Be Built on Transformers

Eye On A.I.


Can long contexts replace fine-tuning?

Craig asks whether growing context windows can substitute for fine-tuning; Jacob argues that long-context power retention enables more in-context learning, reducing the need for fine-tuning.

(24:28)
