
How Does AI Work? (Robert Wright & Timothy Nguyen)

Robert Wright's Nonzero


The Importance of Context Dependent Word Embedding

Unlike word2vec, where each word had a single vector, in this transformer model there are many vectors corresponding to each word. So you now have billions of parameters operating on these words, which live in a thousand-dimensional space. It's very mind-boggling to try to understand. So I would be lying if I said I understand all this, but let me ask you this: getting back to this idea of mapping words in semantic space, would all three of these variables be important for that mapping? Yeah, but I'd say this is where interpretability becomes very hard, because unlike word2vec, where every word has a single vector, it's the whole
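The distinction in the passage above can be sketched in a few lines of toy code. This is not the transformer architecture itself, just an illustration under simplified assumptions: a tiny made-up vocabulary, random 8-dimensional vectors standing in for learned embeddings, and a single bare self-attention pass standing in for a full transformer stack. The point is that a word2vec-style lookup gives the word "bank" the same vector in every sentence, while even one attention layer makes its vector depend on its neighbors.

```python
import numpy as np

rng = np.random.default_rng(0)

# word2vec-style static embeddings: every occurrence of a word
# maps to the SAME vector, regardless of context.
vocab = ["river", "bank", "money", "loan"]
d = 8  # toy embedding dimension (real models use hundreds or thousands)
static = {w: rng.standard_normal(d) for w in vocab}

def self_attention(X):
    """One self-attention pass: each output row becomes a
    context-weighted mixture of all the input rows."""
    scores = X @ X.T / np.sqrt(X.shape[1])
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)  # softmax over the sentence
    return weights @ X

def contextual(sentence):
    """Stack the static vectors for a sentence, then mix them
    so each word's vector reflects its context."""
    X = np.stack([static[w] for w in sentence])
    return self_attention(X)

s1 = ["river", "bank"]
s2 = ["bank", "loan"]

# Static lookup: "bank" is identical in both sentences.
print(np.allclose(static["bank"], static["bank"]))  # True

# Contextual: the two occurrences of "bank" get different vectors,
# because each is averaged with a different neighborhood.
bank_in_s1 = contextual(s1)[1]  # "bank" after "river"
bank_in_s2 = contextual(s2)[0]  # "bank" before "loan"
print(np.allclose(bank_in_s1, bank_in_s2))  # False
```

This is also why the speaker says interpretability gets harder: with word2vec there is one vector per word to inspect, but in a transformer every occurrence of a word carries its own context-dependent vector, recomputed at every layer.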
