Machine Learning Street Talk (MLST)

Kernels!

Sep 18, 2020
Alex Stenlake, an expert in data puzzles and causal inference, dives into the world of kernel methods. He traces the evolution of kernels and their central role before the rise of deep learning, and the discussion covers the significance of the representer theorem and positive semi-definite kernels. Alex contrasts traditional techniques like SVMs with modern approaches, highlighting the strength of kernels on small-data problems. He also connects kernel methods to neural networks and touches on their applications in various fields.
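Since the description leans on the representer theorem and positive semi-definite kernels, here is a brief sketch of both statements in standard notation (my summary of the textbook results, not a formulation from the episode):

```latex
% Positive semi-definiteness: for all points x_1,\dots,x_n and all
% coefficient vectors c \in \mathbb{R}^n, the kernel k must satisfy
\sum_{i=1}^{n}\sum_{j=1}^{n} c_i\, c_j\, k(x_i, x_j) \;\ge\; 0 .
% Representer theorem (sketch): for such a kernel with RKHS \mathcal{H},
% any minimizer of the regularized empirical risk
f^{*} \;\in\; \arg\min_{f \in \mathcal{H}} \; \sum_{i=1}^{n} L\big(y_i, f(x_i)\big) \;+\; \lambda\, \lVert f \rVert_{\mathcal{H}}^{2}
% can be written as a weighted sum of kernel evaluations at the training points:
f^{*}(x) \;=\; \sum_{i=1}^{n} \alpha_i\, k(x, x_i) .
```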
AI Snips
ANECDOTE

Forgotten Kernels

  • Tim Scarfe refreshed his kernel knowledge by reviewing Chapter 6 (Kernel Methods) of Bishop's Pattern Recognition and Machine Learning.
  • He admits he had forgotten most of it, or perhaps never fully understood it in the first place.
INSIGHT

Data Points as Models

  • Yannick Kilcher finds the connection between data points and models exciting, particularly in transformers.
  • Scaling up data and parameters in transformers yields results similar to data-centric kernel methods, where the training points themselves define the model (see the sketch below).
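To make the "data points as models" idea concrete, here is a minimal kernel ridge regression sketch in plain numpy (my illustration, not code from the episode): the fitted model is nothing but a weighted sum of kernel evaluations against the training data, exactly the form the representer theorem guarantees.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """RBF kernel matrix K[i, j] = exp(-gamma * ||A[i] - B[j]||^2)."""
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq_dists)

# Toy 1-D regression data
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)

# Kernel ridge regression: solve (K + lam * I) alpha = y
lam = 0.1
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)

# Prediction at new points is a weighted sum of kernel evaluations
# against the *training points* -- the data literally are the model.
X_test = np.linspace(-3, 3, 5).reshape(-1, 1)
y_pred = rbf_kernel(X_test, X) @ alpha
print(y_pred)
```

The solve step scales cubically with the number of training points, which is one reason kernels shine on small-data problems but are hard to push to transformer scale.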
INSIGHT

Kernel Teaching Flaw

  • Alex Stenlake and Yannick Kilcher believe kernels are often taught or explained poorly.
  • The shift from statistical learning theory to engineering-focused deep learning contributes to this issue.