Tech Won't Save Us

Elon Musk Profits Off Non-Consensual Deepfakes w/ Kat Tenbarge

Jan 29, 2026
Kat Tenbarge, an independent journalist who covers image-based sexual abuse and AI deepfakes, walks through how Grok and X enabled mass non-consensual nudification, along with the platform failures and moderation cuts that amplified the harm. They discuss child safety risks, the monetization of abusive images, legal and policy gaps, and why leaving or pressuring the platform matters.
INSIGHT

Grok Sparked A Rapid Deepfake Surge

  • Grok's rollback of safeguards created a cultural tipping point that massively accelerated the production of non-consensual imagery.
  • The change transformed a niche abuse practice into millions of generated images within weeks.
ANECDOTE

Public Replies Used To Sexually Edit Photos

  • Users replied to women's photos on X with prompts like "put her in a bikini," and Grok edited the images publicly.
  • Researchers measured thousands of harmful images per hour at the peak of this behavior.
INSIGHT

X Was Already A Deepfake Hub

  • X had long been the main venue for deepfakes, hosting an underground market that evolved into viral abuse.
  • Platform moderation remained reactive and inadequate, letting abusive content proliferate until it went viral.