3min chapter

100x Improvements in Deep Learning Performance with Sparsity, w/ Subutai Ahmad - #562

The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)

CHAPTER

The Potential of Sparsifying Transformers and Language Models

Transformers are becoming very popular, but at the same time they're consuming huge amounts of power and creating a pretty big negative environmental impact. Today, we're having quite a bit of success in sparsifying transformers. With the right hardware optimizations, 90 percent sparsity should give you more than a 10x gain. The brain is 95 to 98 percent sparse, so if we can get anywhere close to the levels of the brain, we're talking multiple orders of magnitude efficiency gains.
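As a rough back-of-the-envelope illustration (my own arithmetic, not a quote from the episode): if a fraction s of a layer's weights is zero and the hardware can skip every zero operand, the theoretical FLOP reduction is 1/(1 - s), and weight and activation sparsity compound multiplicatively. A minimal Python sketch of that arithmetic, with a hypothetical helper name:

# Hypothetical back-of-the-envelope calculation, not code from the episode.
# At sparsity s, only a (1 - s) fraction of weights is nonzero, so the
# theoretical FLOP reduction for a matrix multiply is 1 / (1 - s).
def theoretical_speedup(weight_sparsity, activation_sparsity=0.0):
    """Upper-bound speedup if hardware skips every zero operand.

    Weight and activation sparsity compound, because a multiply-accumulate
    can be skipped whenever either operand is zero.
    """
    dense_fraction = (1 - weight_sparsity) * (1 - activation_sparsity)
    return 1 / dense_fraction

print(theoretical_speedup(0.90))        # 10.0  -- the "more than 10x" figure
print(theoretical_speedup(0.98))        # 50.0  -- weight sparsity at brain-like levels
print(theoretical_speedup(0.95, 0.95))  # 400.0 -- multiple orders of magnitude

This is an idealized upper bound: it assumes hardware that skips zeros at no overhead, which is why the episode stresses "with the right hardware optimizations."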
