
#50 Christian Szegedy - Formal Reasoning, Program Synthesis
Machine Learning Street Talk (MLST)
00:00
Working with Batch Normalization and the Importance of the Technique
Batch normalization is a technique that can substantially accelerate the training of deep networks.
Removing batch normalization does not necessarily break a network; in some cases it can still reach similar results without it.
A recent paper explores training only a small fraction of the network's weights, specifically the layer norm weights, and reports comparable experimental results (see the first sketch below).
Sharing the weights of the convnet layers while keeping the batch norm weights unshared can yield results comparable to a network with fully independent weights (see the second sketch below).
Fine-tuning pretrained networks with techniques like these is common practice in machine learning.
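A minimal sketch of the "train only the normalization weights" idea mentioned above, assuming PyTorch; the tiny model, hyperparameters, and data are illustrative stand-ins, not from the paper discussed in the episode:

```python
import torch
import torch.nn as nn

# Small illustrative network; the referenced experiments use larger architectures.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.BatchNorm2d(16), nn.ReLU(),
    nn.Conv2d(16, 32, 3, padding=1), nn.BatchNorm2d(32), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 10),
)

# Freeze every parameter, then re-enable gradients only for the
# normalization layers (their gamma/beta affine parameters). All other
# weights stay at their initial or pretrained values.
for param in model.parameters():
    param.requires_grad = False
for module in model.modules():
    if isinstance(module, (nn.BatchNorm2d, nn.LayerNorm)):
        for param in module.parameters():
            param.requires_grad = True

# The optimizer only sees the trainable (normalization) parameters.
trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.SGD(trainable, lr=0.1, momentum=0.9)

# One illustrative training step on random data.
x = torch.randn(8, 3, 32, 32)
y = torch.randint(0, 10, (8,))
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()
optimizer.step()
```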
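And a second sketch, assuming PyTorch, of sharing convolution weights while keeping batch norm parameters unshared: one conv layer is reused across several branches, each branch holding its own batch norm statistics and affine parameters. The class name, branch mechanism, and shapes are hypothetical illustrations of the idea, not the speaker's implementation:

```python
import torch
import torch.nn as nn

class SharedConvSeparateBN(nn.Module):
    """One conv layer whose weights are shared across several branches,
    each branch keeping its own (unshared) batch norm."""
    def __init__(self, in_ch, out_ch, num_branches):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, 3, padding=1, bias=False)  # shared weights
        self.bns = nn.ModuleList(
            nn.BatchNorm2d(out_ch) for _ in range(num_branches)  # unshared norms
        )

    def forward(self, x, branch):
        # Same conv weights for every branch, branch-specific normalization.
        return torch.relu(self.bns[branch](self.conv(x)))

layer = SharedConvSeparateBN(3, 16, num_branches=4)
x = torch.randn(8, 3, 32, 32)
y0 = layer(x, branch=0)
y1 = layer(x, branch=1)
```

Because only the batch norm parameters differ per branch, the extra parameter cost is tiny compared to duplicating the conv weights, which is what makes the comparable-results observation interesting.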