Engadget News

Amazon discovered a 'high volume' of CSAM in its AI training data but isn't saying where it came from ... and more Tech News

Jan 30, 2026
The hosts discuss Amazon finding a large volume of AI-related child sexual abuse material in its training data and a nonprofit's concerns about the undisclosed sources. They cover publishers blocking the Internet Archive to keep bots from scraping paywalled articles, along with the push toward licensing deals or lawsuits. They also explain China approving NVIDIA H200 sales and the geopolitics around high-end AI chips.
AI Snips
INSIGHT

High Volume Of CSAM Found In AI Training Data

  • Amazon reported finding a very high volume of AI-related CSAM in its training data last year.
  • Amazon won't disclose the sources, leaving many reports unactionable for law enforcement.
INSIGHT

NCMEC Finds Amazon Reports Unactionable

  • NCMEC said Amazon's nondisclosure made many reports unusable for investigations.
  • The nonprofit contrasted Amazon's reports with those from other companies, whose data led to actionable leads.
INSIGHT

AI Safety Concerns Extend Beyond Training Data

  • Scanning and removing abusive content from training sets is a growing industry challenge.
  • AI chatbots have also been implicated in harmful incidents involving minors, widening safety concerns.