
Backpropagation

Linear Digressions


The Magic of Backpropagation

Backpropagation takes the errors and propagates them backwards through your neural net. Asymptotically, if it's converging, it will approach the set of weights that is optimized over the whole neural net. These days, neural nets are the state of the art for things like image recognition and speech recognition.
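The idea described above can be sketched in a few lines of code: run a forward pass, measure the error at the output, then use the chain rule to push that error backwards to each weight and nudge the weights downhill. This is a minimal illustrative sketch on a toy two-weight linear "network" fitting y = 2x; the function name, data, and hyperparameters are made up for illustration, and real networks would use a library like PyTorch or TensorFlow.

```python
# Toy backpropagation sketch: a two-layer linear "net" y_hat = w2 * (w1 * x),
# trained with squared error on data drawn from y = 2x.
# All names and values here are hypothetical, chosen for illustration.

def train(steps=1000, lr=0.01):
    w1, w2 = 0.5, 0.5                      # initial weights for the two layers
    data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
    for _ in range(steps):
        for x, y in data:
            # forward pass
            h = w1 * x                     # hidden activation
            y_hat = w2 * h                 # network output
            err = y_hat - y                # error at the output
            # backward pass: propagate the error to each weight (chain rule)
            grad_w2 = err * h              # dL/dw2
            grad_w1 = err * w2 * x         # dL/dw1, routed back through w2
            # gradient descent step
            w2 -= lr * grad_w2
            w1 -= lr * grad_w1
    return w1, w2

w1, w2 = train()
print(w1 * w2)  # the product of the weights approaches 2.0 as training converges
```

The "asymptotic" behavior mentioned in the quote shows up here: each pass shrinks the error a little, and the product of the weights approaches the optimum rather than jumping to it in one step.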

