How Does AI Work? (Robert Wright & Timothy Nguyen)

00:00

The Importance of Attention in Language Models

Within words there are semantically significant, at least at a probabilistic level, sequences of letters, just by virtue of the Latin origins of some languages. "Pat" is often going to refer to something paternal, you know, father, and so on. I suspect it picks up in weird ways on the etymological origins of languages in that sense, but who knows. Let me, okay, so attention and transformers. How would you characterize the big finding of that paper? Is it that you want to scan a lot of surrounding tokens in order to find out the meaning? In layperson's language, what is attention?
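[Editor's note: a minimal NumPy sketch of the scaled dot-product attention operation from the "Attention Is All You Need" paper being discussed here, illustrating the "scan surrounding tokens" idea: each token scores every other token for relevance and mixes in their information accordingly. The function names, toy dimensions, and random inputs are illustrative, not from the episode.]

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Each token's query is compared against every token's key; the resulting
    weights determine how much each surrounding token contributes to this
    token's updated (context-aware) representation."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (seq_len, seq_len) relevance scores
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V                   # weighted mixture of value vectors

# Toy example: 4 tokens with 8-dimensional embeddings (made-up numbers).
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
W_q, W_k, W_v = (rng.normal(size=(8, 8)) for _ in range(3))
out = scaled_dot_product_attention(X @ W_q, X @ W_k, X @ W_v)
print(out.shape)  # (4, 8): one context-mixed vector per token
```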
