Using Models to Scale Parallel Workflows
Right now we're targeting what I think of as embarrassingly parallel workloads. Those could be things like: I have 100 million images and I need to compute embeddings for all of them. Then there's another, smaller bucket of things that tend to be simple cron jobs. So the product focuses exclusively on code right now. Eventually we'll probably add more languages, but there's a real benefit to focusing on data teams specifically.
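As an illustrative sketch (not from the talk), an embarrassingly parallel embedding job is just an independent map over inputs; here a hypothetical `embed_image` function is fanned out across a process pool using Python's standard library:

```python
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

def embed_image(path: str) -> list[float]:
    # Placeholder: load the image and run it through an embedding model.
    # A real pipeline would call a vision encoder here; returning a dummy
    # vector keeps this sketch runnable on its own.
    return [float(len(path))]

def embed_all(paths: list[str], workers: int = 8) -> list[list[float]]:
    # Each image is independent of every other image, so the whole job is
    # an "embarrassingly parallel" map: pure fan-out/fan-in, no coordination.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(embed_image, paths, chunksize=256))

if __name__ == "__main__":
    image_paths = [str(p) for p in Path("images").glob("*.jpg")]
    embeddings = embed_all(image_paths)
    print(f"embedded {len(embeddings)} images")
```

At the scale mentioned (hundreds of millions of inputs), the same map pattern would be spread across many machines rather than a single process pool, but the structure of the job stays the same.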