Is Regularization Important in the Learning Algorithm?
How important is explicit regularization when it could be done implicitly in the way that you train your model? I don't really expect the brain to have a little L2 and L1 penalty built into the loss function, right? But it's got to be sort of implicit in there. Backpropagation is the only game in town in machine learning. But it's not obvious that that's the game the brain plays, right? So anyway, unless the biological computer can do the gradient very fast, then we can think of different special things.
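The "little L2 and L1 penalty built into the loss function" mentioned above refers to explicit regularization terms added to a training objective. A minimal sketch of what that looks like for linear regression, with illustrative names (`X`, `y`, `w`, `lam`) that are not from the transcript:

```python
import numpy as np

# Hypothetical linear-regression objective with an explicit L2 (ridge)
# penalty added to the data-fitting term. The penalty coefficient `lam`
# is an assumption chosen for illustration.
def loss(w, X, y, lam=0.1):
    residual = X @ w - y
    data_term = 0.5 * np.mean(residual ** 2)   # ordinary squared error
    l2_penalty = 0.5 * lam * np.sum(w ** 2)    # explicit regularization term
    return data_term + l2_penalty

def grad(w, X, y, lam=0.1):
    # The L2 penalty simply adds lam * w to the gradient ("weight decay").
    residual = X @ w - y
    return X.T @ residual / len(y) + lam * w

# Gradient descent on the penalized loss shrinks the weights toward zero,
# which is the effect the explicit penalty is designed to produce.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
w_true = np.array([2.0, -1.0, 0.5])
y = X @ w_true
w = np.zeros(3)
for _ in range(500):
    w -= 0.1 * grad(w, X, y)
```

The implicit regularization the speaker contrasts this with comes not from such a term but from the training procedure itself, e.g. the tendency of gradient-based optimization to prefer certain solutions.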