The Cost of GPUs in Deep Learning Platforms
Smaller companies could be here, but it's costing them a lot of money, and renting an A100 is about $3 per hour. What would you tell these people?

Yeah, no, that's a perfectly valid argument. If the right solution is to run on CPU at small scale, that's fine. That makes sense. If you have a reasonable volume of inference, though, you get to the point very quickly where a GPU is actually very cost effective. And there are advantages to having your model updated more frequently.
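The trade-off described here can be made concrete with a quick back-of-the-envelope calculation. The sketch below compares per-request cost for a cheap CPU instance versus a ~$3/hr GPU; the throughput figures and CPU price are illustrative assumptions, not numbers from the conversation (only the A100 hourly rate is).

```python
# Back-of-the-envelope: when does renting a GPU beat CPU for inference?
# The ~$3/hr A100 rate comes from the transcript; every other number
# here (CPU price, requests/sec) is a hypothetical assumption.

def cost_per_million_requests(price_per_hour: float,
                              requests_per_second: float) -> float:
    """Dollar cost to serve 1M requests at a given hourly price and throughput."""
    seconds = 1_000_000 / requests_per_second
    hours = seconds / 3600
    return price_per_hour * hours

# Assumed instance profiles (illustrative only):
cpu = cost_per_million_requests(price_per_hour=0.50, requests_per_second=20)
gpu = cost_per_million_requests(price_per_hour=3.00, requests_per_second=500)

print(f"CPU: ${cpu:.2f} per 1M requests")   # higher per-request cost
print(f"GPU: ${gpu:.2f} per 1M requests")   # throughput offsets the hourly rate
```

Under these assumptions the GPU is several times cheaper per request despite the higher hourly price, which is the point being made: once volume is high enough to keep the GPU busy, the throughput advantage dominates.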