American Doom

The coming misinformation war

Jan 14, 2025
Katie Paul, Director of the Tech Transparency Project, shares her insights on the harms of misinformation in today's digital landscape. She and the host discuss Meta's decision to dismantle its fact-checking program and its implications for society. The conversation highlights how shifts in social media moderation are influencing political discourse and young people's exposure to extremist content. Paul also examines the global ramifications of misinformation and the contrasting regulatory responses of the US and the EU, all against the backdrop of Meta's changing leadership.
ANECDOTE

International Harm from Misinformation

  • Facebook's fact-checking program has been crucial internationally, particularly in countries where the platform is a primary communication tool.
  • The Rohingya genocide and lynchings in India highlight the real-world harm caused by unchecked misinformation.
INSIGHT

Meta's Moderation Downfall

  • Ending fact-checking is only one part of Meta's broader retreat from moderation.
  • Rolling back content moderation policies suggests the platform is becoming a free-for-all.
INSIGHT

Reactive Moderation

  • Meta's content moderation policies are reactions to outside pressure rather than proactive measures.
  • Many policies arose from PR crises, allowing Meta to claim it had acted without real enforcement.