
Azeem’s Picks: AI’s Near Future with Jürgen Schmidhuber

Azeem Azhar's Exponential View


The Scale of Recurrent Neural Networks

AI research has a long history of pre-wiring certain things that people thought could not be learned. In recent decades, however, systems have emerged that need less and less pre-wired knowledge. A large LSTM of today, of the type used in Google Translate or in Facebook Translate, has maybe a billion connections. Let's quickly compare that to the human brain, which has a million times more.
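As a rough sketch of the arithmetic behind that comparison (my own illustration, not from the episode), the layer sizes below are hypothetical, but they show how an LSTM stack reaches roughly a billion connections, while common estimates put the human brain at around 10^15 synapses, i.e. about a million times more:

```python
def lstm_connections(input_size: int, hidden_size: int) -> int:
    """Approximate trainable connections in one LSTM layer:
    four gates, each with input weights, recurrent weights, and a bias."""
    return 4 * (input_size * hidden_size + hidden_size * hidden_size + hidden_size)

layers = 8        # hypothetical stack depth
hidden = 4096     # hypothetical layer width
total = layers * lstm_connections(hidden, hidden)
print(f"LSTM stack: ~{total:.1e} connections")    # on the order of 1e9

brain_synapses = 1e15   # rough common estimate of synapses in the human brain
print(f"Brain has roughly {brain_synapses / total:,.0f}x more")  # ~a million
```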
