4min chapter


#69 DR. THOMAS LUX - Interpolation of Sparse High-Dimensional Data

Machine Learning Street Talk (MLST)

CHAPTER

Using a Smooth Elbow in a Neural Network, Is That Really the Point?

The smooth parts of the activation functions, I think, in the current paradigm of neural networks, are basically just thrown out. They're ignored. It's not even using that smooth section to any significant degree. What it's doing is exactly what it does with ReLUs, which is chop up the space into this honeycomb and effectively take advantage of the linear parts of the activations. So really, anything we do to train a neural network faster comes down to: how did we make the loss landscape nicer? That's one way to view it. We either made the loss landscape nicer, so more convex, or more of the loss
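To make the "honeycomb" picture concrete, here is a minimal NumPy sketch (not from the episode; the network size, weights, and sample count are made up for illustration). It shows that a small ReLU network is piecewise affine: each pattern of active units labels one linear cell of input space, and inside a cell the network equals a fixed affine map A x + c.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny 2-layer ReLU network R^2 -> R with hypothetical random weights.
W1 = rng.normal(size=(8, 2)); b1 = rng.normal(size=8)
W2 = rng.normal(size=(1, 8)); b2 = rng.normal(size=1)

def forward(x):
    h = W1 @ x + b1
    return W2 @ np.maximum(h, 0.0) + b2

def pattern(x):
    # Which ReLUs are "on" -- this sign pattern labels one linear cell.
    return tuple((W1 @ x + b1) > 0)

# Count the distinct linear cells hit by random samples: the "honeycomb".
samples = rng.normal(size=(5000, 2))
cells = {pattern(x) for x in samples}
print(f"{len(cells)} linear regions found among 5000 samples")

# Inside one cell the map is exactly affine: f(x) = A x + c with A, c fixed.
x0 = samples[0]
mask = np.diag(np.array(pattern(x0), dtype=float))
A = W2 @ mask @ W1            # effective linear map in this cell
c = W2 @ mask @ b1 + b2       # effective offset
assert np.allclose(forward(x0), A @ x0 + c)
```

This is the structure the speaker is pointing at: if inputs rarely land near the curved "elbow" of a smooth activation, the network behaves essentially like this piecewise-linear partition, with the smooth section going largely unused.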

