2min chapter


ML Scalability Challenges // Waleed Kadous // MLOps Podcast #154

MLOps.community

CHAPTER

The Complexity of Recurrent Neural Networks

Noah Goodman is doing this really interesting work, which is to apply the techniques of, like, neuroscience to understand what's actually going on in these 175 billion parameters. It usually takes like 18 months to two years for those problems to kind of be sorted out, depending on the level of complexity, and sometimes much longer. So I think we're at the very beginnings of this. The key thing is just to be ready when it comes. This type of revolution doesn't happen very often.
