
Vladimir Vapnik: Statistical Learning

Lex Fridman Podcast


Deep Learning as a Network?

Deep learning is often viewed through the lens of complex network architectures, but Vapnik raises concerns about its theoretical justification and about the qualifications of those building such systems. He draws a historical parallel to post-war decisions made under unqualified leadership, arguing that today's machine learning landscape suffers in a similar way when practitioners lack deep mathematical understanding. The result is interpretations that can mislead the field, such as reliance on mathematics that has neither necessity nor clarity. He emphasizes the need for a rigorous mathematical treatment of convergence in learning systems rather than vague appeals to how the brain works. In his view, the optimal solutions to these mathematical problems may be found with shallow networks, which calls into question the mainstream focus on deep architectures. He advocates a grounded approach to mathematical problem-solving over speculative theories that lack empirical support.
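One classical result bears on the shallow-network remark, though it is general background rather than Vapnik's specific argument: the universal approximation theorem (Cybenko, 1989) states that a single hidden layer with a continuous sigmoidal activation $\sigma$ can approximate any continuous function $f$ on $[0,1]^n$ to any accuracy $\varepsilon > 0$, i.e. there exist $N$ and parameters $v_i, w_i, b_i$ such that

$$\left| f(x) - \sum_{i=1}^{N} v_i \, \sigma\!\left(w_i^\top x + b_i\right) \right| < \varepsilon \quad \text{for all } x \in [0,1]^n.$$

The theorem guarantees representational power of shallow networks; it says nothing about how such an approximation is found, which is exactly the gap Vapnik's call for a rigorous theory of convergence addresses.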

