

MLOps Coffee Sessions #1: Serving Models with Kubeflow
Jun 13, 2020
This episode explores model serving in machine learning, covering serverless concepts, API endpoints, streaming versus batch data, and real-time prediction scenarios, with a sprinkle of coffee-vs-tea banter. The hosts discuss optimizing model serving with Kubeflow and the challenges of deploying models in production, then walk through practical applications of Kubeflow, model training with the Iris dataset, building custom model servers, and plans for in-depth MLOps discussions with audience engagement.
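Since the episode walks through training on the Iris dataset and then serving the result with Kubeflow, here is a minimal sketch of the first half of that workflow: training a scikit-learn classifier on Iris and saving it as a joblib artifact. The choice of `LogisticRegression` and the `model.joblib` filename are assumptions for illustration (Kubeflow's sklearn model server conventionally loads a joblib/pickle artifact from cloud storage), not the exact code used in the episode.

```python
# Sketch: train a classifier on the Iris dataset and export it as an
# artifact suitable for a model server. Model choice and filename are
# illustrative assumptions, not taken from the episode.
import joblib
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# Load the classic 150-sample, 3-class Iris dataset.
X, y = load_iris(return_X_y=True)

# Fit a simple multinomial classifier.
model = LogisticRegression(max_iter=1000).fit(X, y)

# Serialize the fitted model; a serving layer would typically load
# this artifact from object storage (e.g. S3/GCS).
joblib.dump(model, "model.joblib")
```

From there, the serving side would point a Kubeflow inference service at the storage location of `model.joblib` and expose a prediction endpoint.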
Chapters
Introduction
00:00 • 2min
Deploying and Customizing Machine Learning Models on Kubeflow
02:27 • 8min
Optimizing Model Serving with Kubeflow
10:52 • 10min
Exploration of Learning Process & Model Training with Iris Dataset
21:15 • 3min
Building and Serving Models with Kubeflow
24:28 • 17min
Planning In-Depth MLOps Discussions and Encouraging Audience Engagement
41:34 • 2min
Optimizing Model Serving with Kubeflow Infrastructure
43:47 • 6min