
Large models on CPUs (Practical AI #221)


The Importance of Smaller Models at Neural Magic

Everybody's very excited about sparsity specifically, mainly because you can take these large models and get rid of up to 95%, even 97%, of the weights, which are actually useless. Obviously, you can use that for a lot of efficiencies around performance and energy. And that's specifically where we've been focusing at Neural Magic, and what I've been focusing on in my work. Generally, there are going to be two cases that we're looking at in terms of deployment.
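For context, here is a minimal sketch of unstructured magnitude pruning, one common way to induce the kind of weight sparsity described above. This is purely illustrative, not Neural Magic's actual method; the function name and threshold choice are assumptions for the example.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float = 0.95) -> np.ndarray:
    """Zero out the smallest-magnitude entries so that roughly
    `sparsity` fraction of the weights become zero (illustrative only)."""
    threshold = np.quantile(np.abs(weights), sparsity)
    mask = np.abs(weights) >= threshold  # keep only the largest ~5% of weights
    return weights * mask

# Example: prune a random weight matrix to ~95% sparsity.
w = np.random.randn(512, 512)
w_sparse = magnitude_prune(w, sparsity=0.95)
print(f"sparsity: {np.mean(w_sparse == 0):.2%}")
```

The resulting zeros are what a sparsity-aware runtime can exploit to skip computation, which is where the performance and energy savings mentioned in the quote come from.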

