Game Theory for Model Interpretability: Shapley Values
Linear Digressions
Using a Shapley Value in a Complex Model to Make a Simpler Model
The Shapley value tells you how important a given feature is for a given prediction, and that helps you with the interpretation. What it's not doing is deconstructing the model itself. It's kind of like turning features on and off, or adding them in different power-set combinations and looking at the outputs.
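A minimal sketch of the "power set" idea described in the quote: for one prediction, enumerate every subset of features, score the model with and without each feature, and average the weighted marginal contributions. The toy linear model and the "feature off means fall back to a baseline value" convention below are illustrative assumptions, not the episode's exact setup.

```python
from itertools import combinations
from math import factorial

def toy_model(x):
    # Hypothetical stand-in model: a fixed linear combination of three features.
    return 3.0 * x[0] + 2.0 * x[1] - 1.0 * x[2]

def value(subset, x, baseline):
    # "Turn off" features outside the subset by replacing them with a baseline.
    masked = [x[i] if i in subset else baseline[i] for i in range(len(x))]
    return toy_model(masked)

def shapley_values(x, baseline):
    # Exact Shapley values by brute-force enumeration of all feature subsets.
    n = len(x)
    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for size in range(n):
            for subset in combinations(others, size):
                s = set(subset)
                # Classic Shapley weighting: |S|! * (n - |S| - 1)! / n!
                weight = factorial(size) * factorial(n - size - 1) / factorial(n)
                marginal = value(s | {i}, x, baseline) - value(s, x, baseline)
                phi[i] += weight * marginal
    return phi

if __name__ == "__main__":
    x = [1.0, 2.0, 3.0]          # the instance being explained
    baseline = [0.0, 0.0, 0.0]   # reference point for a "switched off" feature
    print(shapley_values(x, baseline))  # [3.0, 4.0, -3.0] for this toy model
```

Because the toy model is additive, each feature's Shapley value is just its own contribution relative to the baseline; the same enumeration applies to non-additive models, but the cost grows exponentially in the number of features, which is why practical tools approximate it by sampling coalitions.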


