
Predictive Policing: Bias In, Bias Out
Data & Society
Introduction
Kristian Lum explores what happens when predictive policing systems are trained on biased data. She elaborates on the concept of "bias in, bias out" in machine learning, using a case study from Oakland, California. Databites is presented by Data & Society, a research institute in New York City focused on the social, cultural, and ethical issues arising from data-centric technological development. For more information, visit datasociety.net.