The Gradient: Perspectives on AI

Yoshua Bengio: The Past, Present, and Future of Deep Learning

Is There a Curse of Dimensionality?

The curse of dimensionality idea is something that actually arose just a couple of years before, in work I did with my brother. The next stage of your work I'd love to talk about is word embeddings from neural networks and neural language models. There is a lot of important work here, and much of it has to do with trying to bypass the curse of dimensionality. You can take advantage of the similarity between words. That's the first-order thing; then you can have more complicated similarities.

