Embrace Your Hyperplanes
All the success of neural networks seems explained by piecewise linear functions. Randall's spline work makes that bit of philosophical insight brutally clear, in my opinion. ReLU is by far the dominant activation function because it stops pretending to be anything other than piecewise linear: just a flat region up to a boundary threshold, then a line. A neuron puts in a hyperplane, then lets the rest of the network chop.
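A minimal NumPy sketch of that picture (all weights and sizes here are made up for illustration): a single ReLU neuron is flat on one side of its hyperplane and linear on the other, and a layer of such neurons chops the input plane into linear regions, one per on/off activation pattern.

```python
import numpy as np

# One ReLU neuron: max(0, w.x + b). The set {x : w.x + b = 0} is a
# hyperplane; the neuron is flat (zero) on one side of it and linear
# on the other. (Illustrative weights, not from any source.)
w = np.array([1.0, -2.0])
b = 0.5

def neuron(x):
    return np.maximum(0.0, x @ w + b)

# A small random ReLU layer "chops": each hidden unit's hyperplane
# splits the input plane, and the on/off activation pattern labels
# one linear region. Count distinct patterns over a grid of points.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 8))   # 8 hidden hyperplanes in 2-D input
b1 = rng.normal(size=8)

xs = np.stack(np.meshgrid(np.linspace(-3, 3, 200),
                          np.linspace(-3, 3, 200)), axis=-1).reshape(-1, 2)
patterns = (xs @ W1 + b1 > 0)           # which units fire at each point
regions = {tuple(p) for p in patterns}  # distinct linear regions seen
print(f"{len(regions)} linear regions carved by 8 ReLU hyperplanes")
```

Within any one of those regions every unit is either stuck at zero or acting as a plain affine map, so the whole network restricted to that region is a single linear function, which is the piecewise-linear picture in one line.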