2min chapter


100x Improvements in Deep Learning Performance with Sparsity, w/ Subutai Ahmad - #562

The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)

CHAPTER

Is There Potential in Deep Learning?

We published a paper called "Two Sparsities Are Better Than One." We showed that when you think about hardware architectures, you actually have to be careful about the sparsity patterns. So we've come up with a technique called complementary sparsity, where there's a set of patterns that map really, really efficiently to the hardware. If you can do all of that well, balance everything, and still get really accurate networks, we've shown you can get two orders of magnitude improvement in performance.
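To make the "complementary" idea concrete: the trick discussed here is to combine several sparse kernels whose non-zero positions do not overlap, so they can be packed into one dense structure and evaluated together. The sketch below is only a rough illustration of that packing idea in NumPy, not Numenta's actual implementation (which targets FPGAs, convolutions, and sparse activations); the array names `owner` and `packed_weights` are invented for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

D = 64   # input dimensionality
K = 8    # number of sparse kernels packed together
assert D % K == 0

# Assign each of the D weight positions to exactly one of the K kernels,
# so the kernels' non-zero supports are disjoint ("complementary").
owner = rng.permutation(np.repeat(np.arange(K), D // K))

# Overlaying the K disjoint sparse kernels (each with density 1/K)
# yields a single dense weight vector.
packed_weights = rng.normal(size=D)

x = rng.normal(size=D)   # input activation (could itself be sparse)

# One dense elementwise multiply computes the contributions for all K
# kernels at once; a segment-sum by owner recovers each kernel's output.
contrib = packed_weights * x
outputs_packed = np.zeros(K)
np.add.at(outputs_packed, owner, contrib)

# Reference: evaluate each sparse kernel separately and compare.
outputs_ref = np.array([
    np.dot(np.where(owner == k, packed_weights, 0.0), x) for k in range(K)
])
assert np.allclose(outputs_packed, outputs_ref)
```

The point of the packing is that the hardware only ever sees one dense operation for K sparse kernels, which is why non-overlapping sparsity patterns map so efficiently onto it.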

