The Importance of Smaller Models in Neural Magic
Everybody's very excited about sparsity specifically, mainly because you can take these large models and get rid of up to 95, even 97% of the weights, which are actually useless. Obviously, you can use that for a lot of efficiencies around performance and energy, and that's specifically what we've been focusing on at Neural Magic and what I've been focusing on in my own work. Generally, there are going to be two cases that we're looking at in terms of deployment.
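As a rough illustration of the kind of unstructured pruning being described, here is a minimal sketch using PyTorch's built-in pruning utilities. The small example model and the 95% sparsity target are assumptions chosen for demonstration; this is not Neural Magic's actual pipeline.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Hypothetical small model standing in for a large network.
model = nn.Sequential(
    nn.Linear(1024, 1024),
    nn.ReLU(),
    nn.Linear(1024, 10),
)

# Magnitude-prune 95% of the weights in each Linear layer:
# the smallest-magnitude weights are zeroed, reflecting the idea
# that most weights contribute little and can be removed.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.95)
        prune.remove(module, "weight")  # make the zeros permanent

# Check the resulting overall sparsity.
total = sum(p.numel() for p in model.parameters())
zeros = sum((p == 0).sum().item() for p in model.parameters())
print(f"sparsity: {zeros / total:.2%}")
```

In practice, pruning is usually interleaved with fine-tuning to recover accuracy, and the performance and energy gains mentioned above depend on a runtime that can actually exploit the zeroed weights.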