Sensemaker

Why do MPs want to boycott Elon Musk’s X over sexualised AI images?

Jan 12, 2026
Daisy Dixon, a Cardiff University lecturer and author of the upcoming book *Depraved*, discusses her experience as a victim of non-consensual AI-generated sexualised images. Clare McGlynn, a law professor and expert on pornography regulation, provides insights into the UK's legal framework on intimate images. They delve into the alarming scale of these AI-generated images, including those depicting minors, and stress the urgent need for accountability from platforms like X. They also explore the gaps in enforcing new AI deepfake laws.
AI Snips
ANECDOTE

Victim's Photos Altered And Sexualised

  • Daisy Dixon discovered that users had taken her profile and gym photos and sexualised them using Grok prompts.
  • She described feeling violated and alienated by the images, which showed obvious sexual alterations.
INSIGHT

Scale And Child Harm In Grok Imagery

  • An analysis found that over half of Grok-generated images were non-consensual sexual images, and 2% depicted someone appearing to be under 18.
  • The Internet Watch Foundation discovered criminal imagery of girls aged 11–13 created using Grok.
INSIGHT

Grok's Brand And Risky Features

  • Grok brands itself as sarcastic and 'anti-woke' and includes a 'spicy' mode to generate porn-adjacent content.
  • That positioning and those features made it one of the first mainstream platforms to allow this kind of sexual content generation.