
Yoshua Bengio: The Past, Present, and Future of Deep Learning

The Gradient: Perspectives on AI


Is There a Curse of Dimensionality?

The curse of dimensionality idea is something that actually arose just a couple of years before, in work I did with my brother. The next stage of your work I'd love to talk about is word embeddings from neural networks and neural language models. There is a lot of important work here, and a lot of it has to do with trying to bypass the curse of dimensionality. You can take advantage of the similarity between words. That's the first-order thing; then you can have more complicated similarities.
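For context, here is a minimal sketch of the kind of neural language model being discussed, in the spirit of Bengio et al. (2003): each word gets a learned embedding vector, so similar words share statistical strength instead of being treated as unrelated symbols. This sketch assumes PyTorch; the vocabulary size, embedding dimension, context length, and layer widths are illustrative, not values from the episode.

```python
import torch
import torch.nn as nn

# Illustrative hyperparameters (not from the episode).
VOCAB_SIZE = 10_000   # hypothetical vocabulary size
EMBED_DIM = 64        # hypothetical embedding dimension
CONTEXT = 3           # number of previous words used as context
HIDDEN = 128          # hypothetical hidden-layer width

class NeuralLM(nn.Module):
    def __init__(self):
        super().__init__()
        # Each word index maps to a learned vector; similar words end up
        # with similar vectors, which is how the model generalizes.
        self.embed = nn.Embedding(VOCAB_SIZE, EMBED_DIM)
        self.hidden = nn.Linear(CONTEXT * EMBED_DIM, HIDDEN)
        self.out = nn.Linear(HIDDEN, VOCAB_SIZE)

    def forward(self, context_ids):
        # context_ids: (batch, CONTEXT) integer word indices
        e = self.embed(context_ids).flatten(start_dim=1)
        h = torch.tanh(self.hidden(e))
        return self.out(h)  # logits over the next word

# Usage: predict a distribution over the next word from CONTEXT words.
model = NeuralLM()
logits = model(torch.randint(0, VOCAB_SIZE, (2, CONTEXT)))
probs = logits.softmax(dim=-1)
```

Because probabilities are computed from continuous embeddings rather than from counts over discrete word sequences, the model does not need to see every word combination to assign it reasonable probability, which is the sense in which it sidesteps the curse of dimensionality.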

