

How AI is being abused to create child sexual abuse material
Sep 10, 2025
Grace Tame, an advocate for survivors of child sexual assault and director of The Grace Tame Foundation, highlights the alarming rise of AI-generated child sexual abuse material in Australia. She discusses the urgent need for specialized grooming-prevention education and the case for criminalizing harmful apps. Grace also confronts the tech industry's complicity in this issue, emphasizing the need for better governance and legislative reform to protect children. Her insights stress the critical importance of immediate action from authorities to tackle the threats posed by AI.
AI Snips
AI Lowers Bar For Creating Harmful Material
- Technology accelerates the production and distribution of child sexual abuse material, lowering barriers for offenders.
- AI image generators are often trained on photos of real children, so the harm is tied to real abuse.
AI Aids Offenders Beyond Image Generation
- Chatbots and AI can be manipulated to produce exploitative descriptions and evade detection.
- Offenders use AI not just for images but also to obtain tactics and advice on evading the law.
Regulate Harmful 'Nudify' Apps And Equip Police
- Outlaw apps whose sole purpose is to remove clothing from photos without consent.
- Empower law enforcement with AI tools for victim identification and to detect AI-generated material.