Cornell researchers Houston Claure and Malte Jung discuss the social consequences of "machine allocation behavior" and how humans feel and behave differently when machines make resource-allocation decisions. Author Tom Kemp talks about his concerns about data collection and privacy in Silicon Valley, his involvement in privacy advocacy, and his proposal of a bill for better regulation of data brokers in California. The podcast also covers the lack of regulation on tech companies' information collection, solutions for addressing big tech dominance, and the importance of education and advocacy.
Machines making decisions that affect people's lives raise questions about fairness and about the potential impact on human relationships.
Regulations and laws should address fairness in machine behavior and promote privacy, competition, and transparency in the tech industry.
Deep dives
Fairness and Monkeys Experiment
The podcast episode begins with a discussion of an experiment involving capuchin monkeys that highlights the concept of fairness. The monkeys were given the task of exchanging a rock for a treat from a researcher. When one monkey received a grape as a reward while the other received a cucumber, the monkey that received the cucumber displayed clear signs of dissatisfaction and frustration. This experiment demonstrates primates' innate sense of fairness and raises questions about the fairness of decisions made by machines and algorithms in many aspects of our lives, such as resume selection, dating app profiles, and gig worker assignments.
How Humans React to Machine Decisions
The episode then explores a study the researchers conducted on human reactions to machine allocation behavior. In the study, people played a collaborative game of Tetris in which one player was given more turns than the other. Participants immediately recognized when the turn allocation was unfair, whether it was decided by a human or by an algorithm. Furthermore, when a person was favored by an algorithm, they tended to perceive their partner as less dominant, a distinction that did not occur when the favoritism came from a human. This suggests that people attend to the perceived fairness of machine decisions and that those decisions can influence how they interact with others.
Impacts of Machines on Human Relationships
The podcast episode emphasizes the need to understand the impact of machine behavior on human relationships. It discusses the potential consequences of machines increasingly making decisions that affect people's lives, such as job assignments, dating profiles, and resume selection. The episode highlights the importance of fairness in interactions and the potential for machines to influence how people perceive and relate to one another. It raises questions about how machines can be programmed to consider trade-offs between performance and fairness and encourages further research in this area.
Addressing Fairness in Machine Behavior
The episode concludes with a discussion of the need for regulations and laws to address fairness in machine behavior. It suggests implementing laws that require machines to consider fairness and to behave in ways humans perceive as fair. The podcast also highlights the importance of making regulations consumer-centric, ensuring that privacy rights are easy to exercise and that AI processes are transparent to consumers. The hope is that by addressing these issues, including the unfair pricing practices and self-preferencing of big tech companies, progress can be made in promoting fairness, protecting privacy, and fostering competition in the tech industry.
This episode features two segments. In the first, Rebecca Rand considers the social consequences of "machine allocation behavior" with Cornell researchers Houston Claure and Malte Jung, authors of a recent paper on the topic with coauthors Seyun Kim and René Kizilcec.
In the second segment, Justin Hendrix speaks with Tom Kemp, author of a new book out August 22 from Fast Company Press titled Containing Big Tech: How to Protect Our Civil Rights, Economy, and Democracy.