
Artificial Intelligence & Large Language Models: Oxford Lecture — #35
Manifold
The Dimensionality of the Vector Space of Our Concepts
The theory that our brains operate in an approximately linear vector space of concepts is kind of interesting. So if you start looking at the vectors generated by these models (in the field, this is called the embedding vector space), you give the model some natural language and it returns a vector. And just as an empirical observation, all of these huge language models we're talking about use a roughly ten-thousand-dimensional vector space.
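The text-in, vector-out pipeline described above can be sketched with a toy example. This is a minimal illustration, not any real model's implementation: the vocabulary, the 4-dimensional table (real models use thousands of dimensions, roughly ten thousand in the episode's observation), and the mean-pooling `embed` helper are all hypothetical stand-ins.

```python
import math
import random

random.seed(0)
DIM = 4  # toy dimensionality; the episode observes real models use ~10,000
vocab = ["cat", "dog", "car"]
# Hypothetical embedding table: one random vector per token.
table = {w: [random.gauss(0, 1) for _ in range(DIM)] for w in vocab}

def embed(text):
    """Crude sentence embedding: mean of the known tokens' vectors."""
    vecs = [table[t] for t in text.split() if t in table]
    return [sum(component) / len(vecs) for component in zip(*vecs)]

def cosine(a, b):
    """Similarity of two vectors; nearby concepts score closer to 1."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

v = embed("cat dog")
print(len(v))          # every input maps to a vector of the same fixed dimension
print(cosine(v, table["cat"]))
```

The key property the toy version preserves is that every input, short or long, lands in the same fixed-dimensional space, which is what makes comparing concepts by vector geometry possible at all.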