The Documentary Podcast

Assignment: Spain - can an algorithm predict murder?

May 27, 2025
In a heart-wrenching inquiry, the podcast delves into the tragic murder of a woman deemed at medium risk by a predictive algorithm designed to protect her. It raises critical questions about the effectiveness and ethics of using technology like VioGén to combat gender-based violence. Personal testimonies reveal systemic failures by law enforcement in safeguarding vulnerable victims. The discussion highlights both the flaws of risk assessment tools and the urgent need for reform to prevent further tragedies.
ANECDOTE

Tragic Failure of the VioGén System

  • Lina was assessed as medium risk by the VioGén system; three weeks later she was killed by her ex-partner.
  • Her story illustrates a tragic failure of algorithmic risk assessment to predict and prevent domestic violence.
INSIGHT

VioGén’s Risk Assessment Process

  • VioGén uses a 35-question survey to assess risk and guide police resource allocation (see the toy sketch after this list).
  • Only the high and extreme risk levels trigger intensive police protection such as 24-hour surveillance.
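The actual VioGén indicators, weights, and cut-offs are not public, but the process described above, a fixed questionnaire whose answers are combined into a score that is mapped to a risk band, with only the top bands triggering intensive protection, can be illustrated with a minimal sketch. Every number, threshold, and level name below is a hypothetical placeholder for illustration, not the real system.

```python
# Hypothetical sketch of a questionnaire-based risk classifier.
# The real VioGén questions, weights, and thresholds are confidential;
# the values here are illustrative placeholders only.
from enum import Enum


class RiskLevel(Enum):
    NOT_DETECTED = 0
    LOW = 1
    MEDIUM = 2
    HIGH = 3
    EXTREME = 4


# Placeholder cut-offs mapping a summed score to a risk band (highest first).
THRESHOLDS = [
    (40, RiskLevel.EXTREME),
    (30, RiskLevel.HIGH),
    (18, RiskLevel.MEDIUM),
    (8, RiskLevel.LOW),
]


def classify(answers: list[int]) -> RiskLevel:
    """Combine the 35 questionnaire answers into a score and map it to a band."""
    if len(answers) != 35:
        raise ValueError("expected answers to all 35 questions")
    score = sum(answers)
    for cutoff, level in THRESHOLDS:
        if score >= cutoff:
            return level
    return RiskLevel.NOT_DETECTED


def protection(level: RiskLevel) -> str:
    """Only the top bands trigger intensive protection such as 24-hour surveillance."""
    if level in (RiskLevel.HIGH, RiskLevel.EXTREME):
        return "intensive protection (e.g. 24-hour police surveillance)"
    return "periodic follow-up only"
```

Even in this toy version, the design choice the episode questions is visible: a case scored just below a cut-off, like the medium-risk assessments given to Lina and Sofia, receives far less protection than one scored just above it.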
ANECDOTE

Sofia’s Harrowing Abuse Journey

  • Sofia, assessed as medium risk by VioGén, endured repeated abuse and harassment after the court rejected her requests for a restraining order.
  • Her ex-husband's violence escalated, culminating in a murder attempt before he was imprisoned.