

Wendy Chun rejects the unsustainable assumptions that govern networks & make technology undemocratic
Mar 24, 2023
54:47
Wendy Hui Kyong Chun holds the Research Chair in New Media in the School of Communication at Simon Fraser University. She's also the Director of the Digital Democracies Institute there, a group of scholars and stakeholders from around the world who collaborate across disciplines to generate more democratic technologies and cultures. Wendy herself has studied both Systems Design Engineering and English Literature, a combination she draws on to understand contemporary trends and threats in digital media and emerging technologies.
She is the author of books like Control and Freedom: Power and Paranoia in the Age of Fiber Optics, Programmed Visions: Software and Memory, Updating to Remain the Same: Habitual New Media, and Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition.
In this episode, Wendy and I talk about how existing network structures reinforce discrimination. She's one of a number of theorists critical of what she calls the "segregationist defaults" built into networks we're supposed to assume are mechanical, other-than-human, and thus somehow devoid of prejudice. Instead, she says 'no': in fact, "Twentieth-century eugenics and twenty-first-century data analytics… both promote or presume segregation."
This gives us a new way to approach the problem of political polarization. Chun argues that the assumption that people naturally seek to associate with those who are like them, those who look and think and act alike, is not a neutral description of human behaviour but an assumption that historically produces itself as a fact. So it is not that homogeneous groups somehow just naturally clash with other homogeneous groups; it is that an "unsustainable" assumption of homogeneity and homophily as baseline realities has obscured the inherent democratic virtue of difference and a diversity of worldviews. This "erases conflict," but not in the sense of coping with it or resolving it. And so, especially in an algorithm-driven era, polarization proliferates with overwhelming force.
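To make that self-fulfilling mechanism concrete, here is a small toy sketch of my own (not a model from Chun's work or from the episode, and every name and parameter in it is hypothetical): a "people you may know" recommender that proposes ties only between users who share an attribute. Nothing about the users themselves changes, yet the network it produces grows measurably more homogeneous, so the homophily the recommender assumed shows up afterwards as an apparent fact about the network.

```python
# Illustrative toy example only: a similarity-based recommender that makes its
# own homophily assumption come true. All names and numbers are hypothetical.
import random
from itertools import combinations

random.seed(0)

# Each user gets one binary attribute standing in for any trait
# (politics, taste, neighbourhood).
users = {i: random.choice(["A", "B"]) for i in range(40)}
edges = set()

# Seed the graph with a few random connections, ignoring attributes.
while len(edges) < 60:
    u, v = random.sample(list(users), 2)
    edges.add(tuple(sorted((u, v))))

def homogeneity(edges):
    """Share of edges that connect users with the same attribute."""
    same = sum(1 for u, v in edges if users[u] == users[v])
    return same / len(edges)

print(f"before recommendations: {homogeneity(edges):.2f} same-attribute ties")

# Each round, the recommender proposes ties between "similar" users only,
# and users accept them. Only the homophily baked into the recommender
# does the work of segregating the network.
for _ in range(5):
    candidates = [
        (u, v) for u, v in combinations(users, 2)
        if users[u] == users[v] and tuple(sorted((u, v))) not in edges
    ]
    edges.update(tuple(sorted(pair)) for pair in random.sample(candidates, 20))

print(f"after recommendations:  {homogeneity(edges):.2f} same-attribute ties")
```

The design choice to recommend only "similar" people is the whole story here: the measured homogeneity rises round after round, and that rising number can then be cited as evidence that people prefer their own kind.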
We talk about these ideas, which challenge the common-sense assumptions folks often have about contemporary technology, and we also tackle things like facial recognition and the way artificial intelligence is becoming an increasingly normal part of our lives. Wendy's point is that facial recognition and machine learning are used in insidious, often exploitative, and almost always discriminatory ways, but that they don't need to be.
AI, she says, doesn't need to be a "nightmare" that undermines and displaces "human decision-making." What if these technologies were democratized? It may seem implausible, given the tech monopolies that silently govern many of our interactions through the diffusion of these technologies, but what if there were broader public power and greater participation in deciding what AI should and shouldn't do? The point, as she says, is to "point to realities and futures [that need] to be rejected." Prediction does us no good without power.