The Limits of Neural Tangent Kernels in Deep Learning
The theory of machine learning algorithms is often divided into two main facets. There are questions of training dynamics: things like optimization behavior and convergence lie in this camp. And the other side is generalization, which asks: independent of how you got to the final solution, how well does it do? How well does it generalize from your training data to your test data? So I think people who point out its limitations are correct in noting limitations, but at the same time, I think there's a surprising amount we can learn just from studying kernels. We were not super aware that there's an enormous body of literature on this that I've since gone back and integrated. The thing that we noticed fairly quickly was that
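The neural tangent kernel mentioned in the episode title is, in its empirical form, just the inner product of a network's parameter gradients at two inputs. As a minimal sketch (the one-hidden-layer architecture and all sizes here are hypothetical, chosen only for illustration, not taken from the episode):

```python
import numpy as np

rng = np.random.default_rng(0)
d, h = 3, 64  # hypothetical input dimension and hidden width
W = rng.normal(size=(h, d)) / np.sqrt(d)  # hidden-layer weights
a = rng.normal(size=h) / np.sqrt(h)       # output weights

def grad_f(x):
    """Gradient of f(x) = a . tanh(W x) with respect to all
    parameters (W and a), flattened into one vector."""
    act = np.tanh(W @ x)                            # hidden activations
    g_a = act                                       # df/da
    g_W = (a * (1 - act**2))[:, None] * x[None, :]  # df/dW via chain rule
    return np.concatenate([g_W.ravel(), g_a])

def ntk(x1, x2):
    """Empirical neural tangent kernel: inner product of
    parameter gradients at the two inputs."""
    return grad_f(x1) @ grad_f(x2)

x1, x2 = rng.normal(size=d), rng.normal(size=d)
k = ntk(x1, x2)
```

In the infinite-width limit this kernel stays fixed during training, which is what lets kernel methods describe the network's training dynamics; at finite width it drifts, which is one source of the limitations the speaker alludes to.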