Is Serverless a Good Workflow for Machine Learning?
I think we've focused a lot on online inference recently. Cost is driving a lot of that, the demand for serverless vendors for GPU compute specifically. Other people also use it for things like computational biotech and large-scale chess coding. You can also use it for various types of simulations or backtesting, that kind of stuff. That being said, Modal, going back to its roots, started out focusing a lot on what I think were embarrassingly parallel problems. So we're working on a lot of the other stuff, like data pipelines and building more complex support for scheduling and that kind of thing, where right now it's good, but it's not...
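To make the "embarrassingly parallel" point concrete, here is a minimal sketch of fanning a workload out over serverless containers, assuming Modal's current Python SDK (`modal.App`, `@app.function()`, `.map()`); the app name, the `score` function, and its parameters are hypothetical placeholders, not something from the episode.

```python
import modal

app = modal.App("parallel-backtest-example")  # hypothetical app name

@app.function(gpu="any")  # request any available GPU; omit for CPU-only work
def score(params: dict) -> float:
    # Placeholder for one independent unit of work, e.g. a single
    # backtest or simulation run that needs no coordination with others.
    return sum(params.values())

@app.local_entrypoint()
def main():
    # .map() fans the calls out across many serverless containers,
    # which spin up on demand and bill only while they run.
    grid = [{"alpha": a, "beta": b} for a in range(10) for b in range(10)]
    results = list(score.map(grid))
    print(max(results))
```

Run with `modal run <file>.py`; because each call is independent, the platform can scale the map out to as many workers as it can schedule, which is exactly why this class of problem fits serverless compute so naturally.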