2min chapter


The Future of Search in the Era of Large Language Models // Saahil Jain // MLOps Podcast #150

MLOps.community

CHAPTER

The Challenges of Using GPUs to Train Models

I think it comes down ultimately to the user experience, so what can we do to really trace back from there? One thing that we have noticed is that GPUs are quite expensive, so in general, when we can use CPUs we do, and CPU utilization is important. If you're not careful, if you don't account for user queries that are really long, you might end up spiking CPU utilization on those really long queries, because your model ends up not batching them correctly, or iterating through all of them. It's a great question, but at a high level I would say, "GPUs make a lot of sense."
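The batching pitfall described above can be sketched in plain Python. This is a hypothetical illustration (the speaker does not describe Uber's actual serving code): with naive padding, one rare long query forces every query in the batch to be padded to its length, multiplying the CPU work; truncating outliers to a cap first keeps batch cost bounded. The function names and the cap of 32 tokens are assumptions for the sketch.

```python
def pad_to_longest(batch, pad_token=0):
    """Naive batching: pad every query to the longest one in the batch."""
    longest = max(len(q) for q in batch)
    return [q + [pad_token] * (longest - len(q)) for q in batch]

def clip_queries(queries, max_tokens=32):
    """Truncate outlier queries to a cap before batching, so a single
    very long query cannot inflate the padded width of the whole batch."""
    return [q[:max_tokens] for q in queries]

# Two short queries plus one pathological 500-token query.
queries = [[1] * 5, [1] * 7, [1] * 500]

naive = pad_to_longest(queries)
# Every row is now 500 tokens wide: the short queries pay for the long one.
assert all(len(row) == 500 for row in naive)

capped = pad_to_longest(clip_queries(queries))
# With the cap, the padded width is bounded at 32 tokens per query.
assert all(len(row) == 32 for row in capped)
```

In practice, production systems often combine a length cap like this with length-based bucketing, so that similar-length queries are batched together and padding waste stays low.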


