
Episode 391: Jeremy Howard on Deep Learning and fast.ai

Software Engineering Radio - the podcast for professional software developers


Is Your Model Learning the Fastest at the Rate of Learning?

We've already talked a little bit about how things work under the hood inside of these libraries. You mentioned stochastic gradient descent and loss functions already. You also mentioned activations, but that's not something we really defined. Could you explain what activations are?

Sure. I mentioned that the weights, or the values that are in your weight matrices …

OK, so meaning that your model is improving the fastest at that rate of learning?

Yes, exactly. And then I mentioned the next thing we do is we replace the negatives with zeros. That particular function is called the rectified linear unit function, or ReLU. It's, again, as a …
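
As a rough illustration of what the snippet describes (not taken from the episode), here is a minimal PyTorch sketch of one layer's activations: inputs are multiplied by a weight matrix, then ReLU replaces the negatives with zeros. The shapes and values are illustrative assumptions.

import torch

# Minimal sketch (illustrative, not from the episode): activations of one layer.
x = torch.randn(4, 3)   # 4 example inputs with 3 features each
w = torch.randn(3, 5)   # a weight matrix; in practice its values are learned by SGD

pre_activation = x @ w                     # linear step: matrix multiply
activations = pre_activation.clamp(min=0)  # ReLU: replace the negatives with zeros
# torch.relu(pre_activation) is equivalent
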

