The work of human hands retains evidence of the humans who created it. While this might seem obvious in the case of something like a painting, where the artist's touch is the featured aspect, it's much less obvious in things that aren't supposed to betray their humanity. Take the algorithms that power search engines, which are expected to produce unvarnished and unbiased results, but which nonetheless reveal the thinking and implicit biases of their programmers.
In an age when things like facial recognition or financial software algorithms are routinely shown to uncannily reproduce the prejudices of their creators, it is easy to forget that this was much less obvious earlier in the century, when researchers like Safiya Umoja Noble were dissecting search engine results and revealing the sometimes appalling material they were highlighting.
In this Social Science Bites podcast, Noble -- the David O. Sears Presidential Endowed Chair of Social Sciences and professor of gender studies, African American studies, and information studies at the University of California, Los Angeles -- explains her findings, insights and recommendations for improvement with host David Edmonds.
And while we’ve presented this idea of residual digital bias as something somewhat intuitive, getting here was an uphill struggle, Noble reveals. “It was a bit like pushing a boulder up a mountain -- people really didn't believe that search engines could hold these kinds of really value-laden sensibilities that are programmed into the algorithm by the makers of these technologies. Even getting this idea that the search engine results hold values, and those values are biased or discriminatory or harmful, is probably the thrust of the contribution that I've made in a scholarly way.”
But through her academic work -- directing the Center on Race & Digital Justice, co-directing the Minderoo Initiative on Tech & Power at the UCLA Center for Critical Internet Inquiry, and writing books like the 2018 title Algorithms of Oppression: How Search Engines Reinforce Racism -- the scale of the problem and the harm it leaves behind are becoming known. Noble's own contributions have been recognized, too: she was named a MacArthur Foundation fellow in 2021 and the inaugural NAACP-Archewell Digital Civil Rights Award winner in 2022.