How can data be biased? Isn’t it supposed to be an objective reflection of the real world? We all know these are somewhat naive rhetorical questions: data can easily inherit bias from the people who collect and analyze it, just as an algorithm can make biased suggestions if it’s trained on biased datasets. The better question is: how do biases creep in, and what can we do about them? Catherine D’Ignazio is an MIT professor who has studied how biases work their way into our data and algorithms, and even into the expression of values that purport to protect objective analysis. We discuss examples of these processes and how data can be used to make things better.
Support Mindscape on Patreon.
Catherine D’Ignazio received a Master of Fine Arts from Maine College of Art and a Master of Science in Media Arts and Sciences from the MIT Media Lab. She is currently an Assistant Professor of Urban Science and Planning and Director of the Data+Feminism Lab at MIT. She is the co-author, with Lauren F. Klein, of the book Data Feminism.