1135: Sandra Matz | How Algorithms Read and Reveal the Real You
Apr 1, 2025
Psychologist Sandra Matz dives into how algorithms harvest a staggering 6GB of your data every hour, unveiling insights about your mental health, political views, and more. She discusses the alarming practices of companies like Facebook, which identified depressed teenagers but chose profit over support. Matz also explains how just 300 likes can reveal your personality better than those closest to you. She highlights the ethical dilemmas of data privacy and introduces innovative data co-ops that help individuals regain control over their digital footprints.
Companies collect an astonishing 6GB of data per hour on individuals, enabling them to predict personal attributes such as mental health status and political views.
Companies prioritize profit over well-being: their algorithms identify vulnerable individuals, yet the resulting data is sold for advertising rather than used to offer support.
Data co-ops present a viable solution for individuals to regain control over their personal information and ensure it benefits collective interests rather than corporate profit.
The pressing need for regulatory measures is highlighted: current protections are inadequate, leaving many individuals exposed to privacy invasions and data misuse.
Deep dives
Granular Psychological Targeting
Companies use psychological targeting to infer intimate details from users' digital footprints, including social media posts, GPS location, and credit card transactions. This information is analyzed to deduce personal characteristics such as personality traits, political ideology, and even sexual orientation. With enough data, algorithms can profile individuals more accurately than close friends or family can. The discussion highlights the alarming implications of this profiling, particularly around mental health, where it can lead to harmful outcomes.
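At its core, the like-based profiling described above is a linear model mapping a person's set of likes to trait scores. A minimal sketch in Python; the page names and weights below are invented for illustration, not taken from any real model:

```python
# Hypothetical weights a model might learn from many labeled profiles,
# mapping pages a user "likes" to an extraversion score.
# All page names and numbers here are invented for illustration.
EXTRAVERSION_WEIGHTS = {
    "party planning": 0.75,
    "stand-up comedy": 0.5,
    "chess puzzles": -0.25,
    "solo hiking": -0.5,
}

def trait_score(likes, weights, bias=0.0):
    """Linear score: sum the learned weight of every page the user liked."""
    return bias + sum(weights.get(page, 0.0) for page in likes)

print(trait_score({"party planning", "stand-up comedy"}, EXTRAVERSION_WEIGHTS))
# prints 1.25
```

Real systems fit such weights across millions of profiles and many traits at once, which is why a few hundred likes are enough to outperform the judgments of a spouse.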
Data Tracking and Privacy Erosion
Data tracking is pervasive and occurs even when individuals believe they have privacy. Users often assume that disabling features like GPS location can protect their data, but multiple sensors on smartphones continue to track their behavior. Even everyday activities such as purchasing habits and social interactions contribute to a detailed profile that tech companies maintain. This comprehensive tracking not only facilitates targeted advertising but also raises concerns about individual privacy and the potential misuse of data.
The Impact of Algorithms on Mental Health
Algorithms have been shown to predict mental health issues based on user behavior, and companies have been accused of monetizing this information rather than using it to offer help. For instance, there are claims that Facebook tracked teens' mental health struggles and used that data for advertising purposes. This raises ethical questions about the responsibility of tech companies to protect vulnerable individuals rather than exploit them. The potential for technology to identify at-risk populations highlights the urgent need for proactive measures to support mental health.
Data Ownership and Co-ops
The idea of data co-ops is presented as a promising solution for individuals to reclaim ownership of their data. These cooperatives would allow members to pool their information willingly, ensuring it is used for mutual benefit rather than exploited by large corporations. One example mentioned is a group of patients with multiple sclerosis who collaborate to share health data, which can lead to improved treatment options. This model emphasizes the potential of collective data management to foster better outcomes and more ethical uses of data across sectors.
Regulatory Challenges and Future Implications
The discussion highlights the challenges surrounding regulatory measures for data collection and usage, particularly as technology evolves. Current legislative efforts might focus on protecting specific groups, like judges, but many other individuals remain unprotected from privacy invasions. As algorithms become more sophisticated and predictive, there is a pressing need for regulations that address the potential dangers of data misuse. Maintaining individual rights while navigating this complex landscape is a crucial consideration for policymakers.
The Dangers of Data Visibility
The episode delves into the real-world consequences of data visibility, such as the alarming ease with which personal data can be used to cause harm. For example, a case was presented involving a judge whose son was murdered after the perpetrator, connected to a previous case, obtained the judge's personal information through data brokers. This underscores the critical need for stricter controls on data accessibility and the dangers of having granular details about individuals widely available. It prompts a conversation about who should have access to personal data and the moral implications of that access.
Hope for a Better Future
Despite the disheartening realities of digital privacy and data misuse, there are grounds for optimism regarding the future of technology and data rights. The discussion encourages the exploration of positive use cases for data that prioritize user consent and ethical practices. It emphasizes the importance of creating narratives that focus on beneficial applications of technology rather than solely on the risks involved. By promoting positive changes and responsible data practices, there is potential for a safer digital environment for everyone.
Companies harvest 6GB of your data hourly. Psychologist Sandra Matz explains how they predict everything from depression to politics—and how to fight back.
Companies collect ~6GB of data per hour on individuals through social media, credit cards, smartphones, and location tracking, enabling predictions about personality, politics, and mental health.
Facebook identified depressed teenagers in 2015 and sold this information to advertisers rather than providing support, prioritizing profit over well-being.
Algorithms need just 300 likes to know someone better than their spouse, while facial recognition can determine sexual orientation with 81% accuracy from facial features alone.
"Anonymized" data isn't truly anonymous — three credit card transactions can uniquely identify a person, revealing unintentional information beyond our curated online personas.
Data co-ops offer a practical solution for regaining control. MS patients in Europe and Uber drivers in the US have formed co-ops to collectively manage their data, allowing them to benefit from data aggregation while maintaining ownership and directing outcomes toward their shared interests rather than corporate profit.
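The re-identification point above can be illustrated with a toy experiment: in even a modest population, a handful of (shop, day) purchase points quickly becomes unique to one person. A sketch on synthetic data; the population size, purchase counts, and target ID are all invented for illustration:

```python
import random

random.seed(0)

# Synthetic population: each person is a set of (shop, day) purchase points.
# All sizes here are invented for illustration.
N_PEOPLE, N_SHOPS, N_DAYS, PURCHASES = 10_000, 50, 30, 20
people = {
    pid: {(random.randrange(N_SHOPS), random.randrange(N_DAYS))
          for _ in range(PURCHASES)}
    for pid in range(N_PEOPLE)
}

def matching_people(known_points):
    """IDs of everyone whose history contains all the known points."""
    return [pid for pid, history in people.items() if known_points <= history]

# An attacker learns just 3 of person 42's purchases...
known = set(random.sample(sorted(people[42]), 3))
candidates = matching_people(known)
print(len(candidates))  # typically 1: three points already single someone out
```

The arithmetic behind it: each person covers about 20 of 1,500 possible (shop, day) cells, so the chance that someone else's history contains three specific points is roughly (20/1500)^3, and even 10,000 people yield almost no accidental matches.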