Vladimir Vapnik: Statistical Learning

Lex Fridman Podcast

Deep Learning as a Network?

Deep learning's effectiveness is scrutinized through the lens of historical decision-making, likened to the misguided negotiations that followed World War I. The criticism targets the reliance on arbitrary architectures and interpretations, suggesting that many practitioners lack a strong mathematical foundation and therefore arrive at ineffective solutions. This hinders progress, since the true complexity of neural networks remains poorly understood. A more reliable approach is to address the mathematical problem directly, where convergence methods play a crucial role. Insight into deep learning may then emerge from its mathematical underpinnings rather than from abstract interpretations of brain function, with optimal solutions represented through structured frameworks rather than arbitrary models.
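
As a hedged illustration of the "structured frameworks" idea, drawn from Vapnik's statistical learning theory rather than quoted from the episode: structural risk minimization works with a nested sequence of function classes $S_1 \subset S_2 \subset \dots$ of increasing VC dimension $h_1 \le h_2 \le \dots$ and selects the class and function that minimize a bound on the true risk, not the training error alone. For a class of VC dimension $h$, a bounded loss, and $n$ training samples, with probability at least $1-\eta$,

$$
R(f) \;\le\; R_{\mathrm{emp}}(f) \;+\; \sqrt{\frac{h\left(\ln\frac{2n}{h}+1\right)-\ln\frac{\eta}{4}}{n}},
$$

so the "structured" choice trades empirical fit against a capacity term instead of relying on an arbitrary architecture.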
