
Serverless GPUs (Practical AI #211)

Changelog Master Feed


ML Model Inference - What's the Difference?

It's a very interesting mesh of skills, it seems, to do what you're doing there. It also kind of brings different cultures together in terms of the choices of languages and stuff. Do you tend to go with one language for everything, for simplicity's sake? Or do you tend to go with different languages catered to specific use cases? How do you approach that, strategy-wise?

So the obvious language for hosting ML model inference is Python.
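Since the conversation points to Python as the default choice for hosting ML model inference, here is a minimal sketch of what such an inference endpoint often looks like. This is not from the episode: it assumes FastAPI and uvicorn are available, and the `load_model` helper and its averaging "model" are placeholders standing in for real weights.

```python
# Minimal sketch of a Python HTTP inference endpoint (hypothetical; not from the episode).
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class PredictRequest(BaseModel):
    features: list[float]

def load_model():
    # Placeholder: a real service would load weights (PyTorch, ONNX, etc.) here,
    # ideally once at startup so individual requests don't pay the load cost.
    return lambda xs: sum(xs) / max(len(xs), 1)

model = load_model()

@app.post("/predict")
def predict(req: PredictRequest):
    # Run the (placeholder) model and return a JSON-serializable result.
    return {"prediction": model(req.features)}

# Run with: uvicorn inference_server:app --host 0.0.0.0 --port 8000
```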

