
Yann LeCun on his Start in Research and Self-Supervised Learning

The Gradient: Perspectives on AI


How Do You Get Started With a Neural Net?

I realized pretty early on that the reason the early attempts at neural nets in the '60s had basically withered is that people were looking for learning rules for multi-layer networks. And I kind of found one, which we would now call target prop. It's the idea that you don't backpropagate a gradient; you backpropagate a virtual target for every neuron, essentially. You can derive an algorithm like this that backpropagates targets, but only if you make everything continuous inside. I, however, was using binary neurons, mostly because the computers I had access to at the time could not do multiplication. If you have binary neurons, you only need to do additions. It…
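The target-prop idea described here can be illustrated with a minimal sketch: instead of backpropagating a gradient, the output target is mapped back through an approximate inverse of the top layer to produce a virtual target for the hidden layer, and each layer then makes a purely local update toward its own target. Everything below — the two-layer tanh network, the pseudo-inverse used as the approximate inverse, and the learning rate — is a hypothetical illustration for this idea, not the algorithm LeCun actually derived.

```python
# Minimal sketch of target propagation on a toy 2-layer net (NumPy).
# Hypothetical illustration: layer sizes, tanh activations, and the
# pseudo-inverse as an "approximate inverse" are assumptions, not the
# original method discussed in the episode.
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(0, 0.5, (4, 3))   # layer 1 weights (3 inputs -> 4 hidden)
W2 = rng.normal(0, 0.5, (2, 4))   # layer 2 weights (4 hidden -> 2 outputs)
lr = 0.1

def forward(x):
    h = np.tanh(W1 @ x)           # hidden activity
    y = np.tanh(W2 @ h)           # output activity
    return h, y

def approx_inverse(W, v):
    # Crude approximate inverse of a layer: undo the nonlinearity, then
    # map the pre-activation back through the pseudo-inverse of the weights.
    return np.linalg.pinv(W) @ np.arctanh(np.clip(v, -0.99, 0.99))

x = rng.normal(size=3)
y_target = np.array([0.5, -0.5])  # desired output (toy target)

for step in range(200):
    h, y = forward(x)
    # Instead of backpropagating a gradient, propagate a *target* for the
    # hidden layer: the hidden activity that would have produced y_target.
    h_target = approx_inverse(W2, y_target)
    # Each layer then makes a purely local update toward its own target.
    W2 += lr * np.outer((y_target - y) * (1 - y**2), h)
    W1 += lr * np.outer((h_target - h) * (1 - h**2), x)

print(forward(x)[1])              # output should approach y_target
```

The appeal LeCun points to in the binary-neuron setting is that the forward pass then needs only additions and comparisons, while virtual targets, rather than gradients, drive the weight updates.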

