[MINI] Backpropagation

Data Skeptic

How to Improve the Error for a Total Neural Network

What we really need here is the first derivative. You can calculate the error for the total neural network. It's kind of like, imagine if every weight was a little knob, and your job was to make a light as bright as possible. That sounds profound, but I have no idea what it means. Can you walk me through that? What do you mean? Did you ever have one of those grown-up ball maze games? No. Really? Only the things that came in a Happy Meal.
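The "knob" analogy above can be sketched in code. Below is a minimal, hypothetical example (not from the episode): a single sigmoid neuron with one weight, where the first derivative of the squared error, obtained via the chain rule, tells us which way to turn the knob to reduce the error.

```python
import math

def sigmoid(z):
    """Logistic activation: squashes any real number into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def error(w, x, target):
    """Squared error for a single input/target pair."""
    y = sigmoid(w * x)
    return (y - target) ** 2

def d_error_dw(w, x, target):
    """First derivative of the error with respect to the weight,
    via the chain rule (backpropagation in miniature):
    dE/dw = 2 * (y - target) * y * (1 - y) * x
    """
    y = sigmoid(w * x)
    return 2 * (y - target) * y * (1 - y) * x

# Turn the knob against the gradient: each step nudges the weight
# in the direction that reduces the error.
w, x, target, lr = 0.0, 1.0, 0.9, 1.0
for _ in range(500):
    w -= lr * d_error_dw(w, x, target)
```

With many weights, backpropagation computes this same kind of derivative for every knob at once, reusing intermediate results layer by layer instead of differentiating each weight independently.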
