Data Skeptic

Explainable K-Means

Mar 3, 2022
25:53
Chapters
1. Introduction (00:00 • 2min)
2. Is Explainability a Hot Topic for Machine Learning? (01:36 • 2min)
3. Unsupervised Learning (03:09 • 3min)
4. Why You Don't Want to Ask Too Many Questions of Clusters (05:48 • 2min)
5. Clustering: Is There Any Concern the Decision Tree Couldn't Explain? (07:45 • 2min)
6. Having a Shorter Tree Depth (09:19 • 3min)
7. Machine Learning: ClearML Is the Best Collaborative ML Tool (12:01 • 2min)
8. Do You Want a Better Partition? (13:47 • 2min)
9. How to Use a Small Penalty in Clustering (15:44 • 4min)
10. How Does Your Algorithm Fare? (19:38 • 3min)
11. How Many Clusters Should I Have? (22:10 • 4min)

Episode notes

In this episode, Kyle interviews Lucas Murtinho about the paper "Shallow decision trees for explainable k-means clustering," which uses decision trees to help explain k-means clustering partitions.
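The paper's own construction isn't reproduced here, but the general idea of explaining a k-means partition with a shallow tree can be sketched with scikit-learn (an assumed dependency, not the authors' code): run standard k-means, then fit a depth-limited decision tree on the resulting cluster labels so each cluster region can be read off as a few axis-aligned threshold rules. A minimal illustrative sketch:

```python
# Illustrative sketch only (not the paper's algorithm): approximate a
# k-means partition with a shallow decision tree so each cluster can be
# described by a handful of axis-aligned threshold rules.
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.tree import DecisionTreeClassifier, export_text

# Toy data with a known cluster structure (2 features, 4 blobs)
X, _ = make_blobs(n_samples=500, centers=4, random_state=0)

# Step 1: run standard (opaque) k-means to get cluster labels
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
labels = kmeans.labels_

# Step 2: fit a shallow tree that predicts the k-means labels; its leaves
# induce an explainable partition approximating the original clustering
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, labels)

# Step 3: print the threshold rules that define each cluster region
print(export_text(tree, feature_names=["x0", "x1"]))
```

The depth cap (here max_depth=3) is the knob the episode discusses: a shorter tree gives simpler explanations but may approximate the k-means partition less faithfully.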

Check out our website for extended show notes! Thanks to our sponsors: ClearML is an open-source MLOps solution users love to customize, helping you easily track, orchestrate, and automate ML workflows at scale.