

Trump, Twitter, and the First Amendment, with platform moderation expert Daphne Keller
Jan 12, 2021
Daphne Keller, Director of the Program on Platform Regulation at Stanford's Cyber Policy Center, discusses internet platform moderation in the wake of the Capitol attack. She highlights the challenges of balancing free speech with regulatory responsibilities, the implications of Section 230, and the power dynamics of tech giants. Keller also addresses the psychological toll on content moderators, potential consequences of repealing Section 230, and the need for evolving strategies in content moderation. A thought-provoking conversation on the future of online discourse.
AI Snips
Keller's Google Experience
- Daphne Keller encountered content moderation firsthand at Google, handling takedown requests such as one from a Turkish politician.
- The politician wanted an article about their alleged corruption removed, illustrating the difficult judgment calls platforms face.
Extraordinary Circumstances
- Platforms often invoke "extraordinary circumstances" to justify content moderation decisions, as seen in Trump's ban after the Capitol attack.
- This raises questions about consistency and how such one-off actions shape broader moderation policy.
Moderation Models
- Content moderation models range from small teams carefully reviewing each decision to large-scale outsourced operations.
- Automation supplements human review, for instance by detecting duplicates of content already found to violate policy, but moderation at scale remains imperfect.