This episode explores the alarming impact of deepfake images on victims, particularly young women, as manipulated images of them are created and disseminated online. It delves into the gendered landscape of deepfakes, including incidents involving influencers and child influencers, and the urgent need to address the harms. It also discusses the lack of regulation in the deepfake space, drawing parallels to past image leaks and emphasizing the scale of harm caused by non-consensual sharing.
Paris Marx is joined by Kat Tenbarge to discuss the proliferation of AI-generated, non-consensual sexual images, their impact on the victims, and the potential fallout for tech companies who helped make it all possible. Note there is some discussion of self-harm and suicide in this episode.
Kat Tenbarge is a tech and culture writer at NBC News.
Tech Won’t Save Us offers a critical perspective on tech, its worldview, and wider society with the goal of inspiring people to demand better tech and a better world. Support the show on Patreon.
The podcast is made in partnership with The Nation. Production is by Eric Wickham. Transcripts are by Brigitte Pawliw-Fry.
Also mentioned in this episode:
- Kat has reported extensively on this issue, including stories about fake nude images of underage celebrities topping search engine results, non-consensual deepfake porn showing up on Google and Bing too, Visa and Mastercard being used to fund the deepfake economy, and why plans for watermarking aren’t enough.
- Another Body is a documentary that looks at the scale of the problem of non-consensual deepfake explicit images.
- Microsoft’s Designer AI tool was used to create AI porn of Taylor Swift.
- Middle and high schools in Seattle, Miami, and Beverly Hills are among those already facing the consequences of AI-generated and deepfake nude images.
- In 2014, Jennifer Lawrence called the iCloud photo hack a “sex crime.”