
ML Scalability Challenges // Waleed Kadous // MLOps Podcast # 154

The Complexity of Recurrent Neural Networks

Noah Goodman is doing this really interesting work, which is to apply techniques from neuroscience to understand what's actually going on in these 175 billion parameters. It usually takes about 18 months to two years for those problems to get sorted out, depending on the level of complexity, and sometimes much longer. So I think we're at the very beginnings of this. The key thing is just to be ready when it comes. This type of revolution doesn't happen very often.
