
AI Today Podcast: AI Glossary Series – Bias, Weight, Activation Function, Convergence, and ReLU


How to Figure Out Which Inputs Are More Important Than Others

We don't really know in advance which inputs are more important than others for whatever we're trying to do. We want the computer to figure that out, and we'll talk later about how it does. So an activation function is a nonlinear function. Without it, all these connections of neurons would just combine everything together into one big pudding. And there's an idea that goes along with this called convergence. The goal is to get that network and its weights adjusted so that when we present it with new information, it gives us the desired results.
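To make these ideas concrete, here is a minimal sketch (not from the episode) of a single neuron: each input gets a weight, the weighted sum plus a bias goes through a ReLU activation, and gradient descent adjusts the weights until the error stops shrinking (convergence). The learning rate `lr`, the toy target function, and the stopping threshold are all assumptions chosen for illustration.

```python
import random

def relu(x):
    """ReLU activation: the nonlinear function, max(0, x)."""
    return max(0.0, x)

random.seed(0)
w = [random.uniform(0.0, 0.1), random.uniform(0.0, 0.1)]  # weights start small
b = 0.0
lr = 0.05  # learning rate (an assumption for this toy example)

# Toy training data: the output depends only on the first input
# (true weights would be ~2.0 for input 1 and ~0.0 for input 2).
data = [([x1, x2], 2.0 * x1) for x1 in (0.5, 1.0, 1.5) for x2 in (0.5, 1.0, 1.5)]

for epoch in range(2000):
    total_loss = 0.0
    for (x1, x2), target in data:
        z = w[0] * x1 + w[1] * x2 + b   # weighted sum plus bias
        y = relu(z)                      # apply the activation
        err = y - target
        total_loss += err * err
        grad = 2 * err * (1.0 if z > 0 else 0.0)  # ReLU derivative is 0 or 1
        w[0] -= lr * grad * x1           # adjust each weight by its gradient
        w[1] -= lr * grad * x2
        b -= lr * grad
    if total_loss < 1e-6:  # convergence: error small enough, stop training
        break

# The network "figured out" that input 1 matters and input 2 does not:
print(round(w[0], 2), round(w[1], 2))  # first weight near 2, second near 0
```

The learned weights are exactly the "importance" the episode describes: the network discovers, purely from the training data, that the first input drives the output and the second is irrelevant.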
