
Vladimir Vapnik: Statistical Learning

Lex Fridman Podcast


Deep Learning Is Not a Very Difficult Problem

Deep learning can outperform the best human players even though it is not fully understood mathematically, which, on this view, suggests it is not a particularly difficult problem. Empirical evidence indicates that deep learning does not necessarily require large datasets: with the right approach, far fewer training samples can still yield effective models, and some problems currently tackled with deep learning may not need it at all. Designing a deep architecture amounts to choosing a suitable set of admissible functions, and that choice should stem from the data rather than from purely theoretical speculation. Training relies on the law of large numbers so that empirical averages track expected risk, but a well-chosen restriction of the admissible set can greatly reduce the data required and reach convergence more efficiently.
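The role of the law of large numbers mentioned above can be illustrated with a minimal simulation (a hypothetical sketch, not from the episode): the empirical average of i.i.d. draws concentrates around the true mean as the sample grows, which is what makes the empirical risk a usable proxy for the expected risk during training.

```python
import random

def empirical_mean(n, p=0.5, seed=0):
    """Average of n Bernoulli(p) draws -- e.g. the empirical risk of one fixed predictor."""
    rng = random.Random(seed)
    return sum(rng.random() < p for _ in range(n)) / n

# Law of large numbers: as n grows, the empirical mean concentrates around p.
for n in (10, 1_000, 100_000):
    print(n, empirical_mean(n))
```

The deviation from the true mean shrinks roughly like 1/sqrt(n), which is why naive training needs many samples; a stronger prior (a smaller admissible set) is what lets a learner get away with fewer.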

Highlight begins at 28:33 in the episode (transcript available).
