
MLG 028 Hyperparameters 2

Machine Learning Guide


Deep Reinforcement Learning - How to Scale Your Neural Network

Batch normalization bakes a scaling step directly into your neural network: it inserts a normalization layer between hidden layers that keeps the activations in a well-behaved range, and it is commonly used in deep learning. Grid search, random search, and Bayesian optimization are among the hyperparameter-search strategies mentioned in the prior episode. See you next time with deep reinforcement learning.
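To make the idea concrete, here is a minimal sketch (in plain NumPy, not any framework's built-in layer) of what a batch normalization layer computes at training time: normalize each feature over the batch, then rescale and shift with learnable parameters, conventionally called gamma and beta.

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    # Normalize each feature to zero mean and unit variance across the batch,
    # then apply a learnable scale (gamma) and shift (beta).
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

# Activations that have drifted off scale between hidden layers.
batch = np.random.randn(32, 4) * 10 + 5
normed = batch_norm(batch)
print(normed.mean(axis=0))  # each feature's mean is pulled to ~0
print(normed.std(axis=0))   # each feature's spread is pulled to ~1
```

In a real network a framework layer (e.g. a batch-norm layer between each linear layer and its activation) would also track running statistics for use at inference time; this sketch only shows the per-batch computation.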
