
#2: Deep Learning based Recommender Systems with Even Oldridge
Recsperts - Recommender Systems Experts
The Cost of GPUs in Deep Learning Platforms
Smaller companies could be here, but it's costing a lot of money, and renting an A100 is about $3 per hour. What would you tell these people?

Yeah, no, that's a perfectly valid argument. So if the right solution is to run on CPU at small scale, that's fine. That makes sense. If you have a reasonable volume, though, you get to the point very quickly where the GPU is actually very cost effective. You know, there are advantages to having your model updated more frequently.
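To make that trade-off concrete, here is a minimal back-of-the-envelope sketch (not from the episode): it reuses only the $3/hour A100 rental price mentioned above, while the CPU instance price and both throughput figures are hypothetical placeholders you would replace with your own measurements.

```python
# Back-of-the-envelope cost comparison for CPU vs. GPU inference.
# The $3/hour A100 price comes from the conversation above; the CPU
# price and throughput numbers below are hypothetical assumptions.

GPU_COST_PER_HOUR = 3.00   # rented A100, as mentioned in the episode
CPU_COST_PER_HOUR = 0.40   # hypothetical price for a CPU instance

GPU_REQS_PER_SEC = 5_000   # hypothetical GPU inference throughput
CPU_REQS_PER_SEC = 200     # hypothetical CPU inference throughput


def cost_per_million_requests(cost_per_hour: float, reqs_per_sec: float) -> float:
    """Cost of serving one million requests at full utilization."""
    reqs_per_hour = reqs_per_sec * 3600
    return cost_per_hour * (1_000_000 / reqs_per_hour)


if __name__ == "__main__":
    gpu = cost_per_million_requests(GPU_COST_PER_HOUR, GPU_REQS_PER_SEC)
    cpu = cost_per_million_requests(CPU_COST_PER_HOUR, CPU_REQS_PER_SEC)
    print(f"GPU: ${gpu:.2f} per 1M requests")
    print(f"CPU: ${cpu:.2f} per 1M requests")
    # Under these assumptions the GPU is cheaper per request once there
    # is enough traffic to keep it busy; at low volume, the cheaper CPU
    # instance wins, which matches the point made in the episode.
```

The crossover point depends entirely on sustained request volume: the GPU's higher hourly price is amortized over far more requests per hour, so utilization, not raw hardware price, decides which option is cheaper.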