
Hugo Larochelle: Deep Learning as Science

The Gradient: Perspectives on AI


The Importance of Sparse Coding in Neural Networks

A lot of the motivation was from neuroscience, and also, at the time, if you look back at the greedy layer-wise pre-training paper and a lot of papers in that area, no one was using ReLUs, rectified linear units. It was all tanh, or maybe sigmoids. I remember when the idea of ReLUs came up in Geoff's lab, and also in Yoshua's lab a bit after, or about the same time. So yeah, for that particular paper of mine on interdependent neurons, a lot of the inspiration for this work is still there.

