Tech and culture writer Kat Tenbarge discusses the crisis of non-consensual deepfake images, including images of celebrities and underage victims, the role of payment companies like Visa and Mastercard, and the documentary 'Another Body.' The podcast sheds light on the harm AI-generated explicit images cause their victims, the lack of regulation in the deepfake space, and the disparity between how tech companies respond to diversity issues and how they respond to deepfake abuse.
Deepfake technology poses a serious threat to individuals' privacy and safety, especially for young people.
The deepfake industry profits from explicit content through various financial platforms, raising concerns about regulation and oversight.
Misuse of deepfake technology reinforces gender discrimination and hinders women's empowerment in male-dominated spaces.
Deep dives
Scale and Implications of Deepfake Technology
Advances in deepfake technology have driven a surge in the production and spread of AI-generated explicit images, affecting both celebrities and everyday people. Where obtaining real images was once necessary, these images can now be created easily with apps available in mainstream app stores. That accessibility has led to cases in which young people, including middle and high schoolers, become victims of non-consensual explicit content created and circulated without their knowledge. The problem crosses borders, and victims worldwide have limited avenues for recourse because specialized training and regulation are lacking.
Monetization and Financial Aspects of Deepfake Production
The deepfake industry thrives on a range of monetization strategies, with websites selling paid subscriptions for customized and extended content, including deepfake videos of people the buyer knows. Payment channels ranging from cryptocurrency wallets to mainstream financial institutions enable the transactions that fund access to this content. The economy around deepfakes is difficult to regulate and oversee: these sites exploit financial channels for profit while often evading the stricter rules imposed on other adult content platforms.
Gender Impact and Social Ramifications of Deepfake Misuse
The misuse of deepfake technology has far-reaching social and gender consequences. Women and girls in male-dominated fields fear being targeted with deepfake imagery, and instances of non-consensual deepfake creation have led some women to leave those industries and to withdraw from public and online spaces. The implicit message, that avoiding visibility is the only way to prevent abuse, deepens existing gender discrimination and undermines women's empowerment and participation across many spheres.
Tech Companies' Negligence in Addressing Harmful AI Technology
Major tech companies are criticized for negligence in addressing the harmful impact of AI technology. The podcast highlights how rapidly developing AI tools are released without serious consideration of their societal consequences or proper regulation. Incidents like the Taylor Swift case, in which Microsoft's generative AI tools were used to create explicit images, underscore the urgent need for accountability and ethical oversight in the AI industry.
Challenges in Regulating Deepfakes and the Role of Search Engines
The podcast also examines the challenges of regulating deepfake content and the complicity of search engines like Google in surfacing harmful AI-generated material. Google's failure to address the issue proactively, and its framing of the problem as a legal question rather than a moral responsibility, raise concerns about online safety and the need for stricter regulation. At the same time, shifting social norms around victim shaming and the efforts of advocacy organizations point to growing awareness and a push for accountability in combating deepfake abuse.
Paris Marx is joined by Kat Tenbarge to discuss the proliferation of AI-generated, non-consensual sexual images, their impact on the victims, and the potential fallout for tech companies who helped make it all possible. Note there is some discussion of self-harm and suicide in this episode.
Kat Tenbarge is a tech and culture writer at NBC News.
Tech Won’t Save Us offers a critical perspective on tech, its worldview, and wider society with the goal of inspiring people to demand better tech and a better world. Support the show on Patreon.
The podcast is made in partnership with The Nation. Production is by Eric Wickham. Transcripts are by Brigitte Pawliw-Fry.
Another Body is a documentary that looks at the scale of the problem of non-consensual deepfake explicit images.
Microsoft’s Designer AI tool was used to create AI porn of Taylor Swift.
Middle and high schools in Seattle, Miami, and Beverly Hills are among those already facing the consequences of AI-generated and deepfake nude images.
In 2014, Jennifer Lawrence called the iCloud photo hack a “sex crime.”