The Gradient: Perspectives on AI

Peli Grietzer: A Mathematized Philosophy of Literature

The Cost Function of a Neural Network

The classic vanilla autoencoder is sometimes ultimately all you need, says Claire. The bottleneck layer in the middle of the network constrains how much information the encoder can pass from the input to the output. Traditionally, we give it a simple Euclidean, pixel-wise distance cost function, and we tell it: what you need to do is produce an output that is as close in pixel space as possible to the input. Claire: One of the big, exciting discoveries of early deep learning was that this process leads to the formation of something like a proto-conceptual mapping for data manipulation.
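The setup described above can be sketched in a few lines of numpy. This is a minimal, hypothetical illustration, not the speaker's actual model: the layer sizes, the single linear encoder/decoder pair, and the tanh nonlinearity are all assumptions chosen for brevity. It shows the two ingredients named in the transcript: a bottleneck that compresses the input, and a Euclidean pixel-wise reconstruction cost.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: a 64-"pixel" input squeezed through an 8-unit bottleneck.
input_dim, bottleneck_dim = 64, 8

# Randomly initialised weights for a one-layer encoder and decoder
# (in practice these would be learned by gradient descent on the cost below).
W_enc = rng.normal(scale=0.1, size=(input_dim, bottleneck_dim))
W_dec = rng.normal(scale=0.1, size=(bottleneck_dim, input_dim))

def encode(x):
    # The bottleneck: input_dim numbers are compressed to bottleneck_dim,
    # which is what limits how much information can pass through.
    return np.tanh(x @ W_enc)

def decode(z):
    # Map the bottleneck code back up to pixel space.
    return z @ W_dec

def reconstruction_cost(x):
    # The cost function from the transcript: mean squared (Euclidean,
    # pixel-wise) distance between the input and its reconstruction.
    x_hat = decode(encode(x))
    return np.mean((x - x_hat) ** 2)

x = rng.normal(size=input_dim)   # a stand-in "image"
cost = reconstruction_cost(x)
```

Training would then consist of adjusting `W_enc` and `W_dec` to drive `reconstruction_cost` down across a dataset; the interesting claim in the episode is that the bottleneck codes learned this way behave like proto-concepts.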
