

Yoel Roth on Banning Trump, Battling Bots & the Difficult Job of Trust & Safety
"Content moderation decisions are like assholes," says Yoel Roth, the former head of trust & safety for Twitter. "Everybody's got one."
The underrated challenge of working in trust and safety is that every decision could affect millions of users, and the reasons for those decisions are often opaque. Today on Revolution.Social, Yoel and Rabble talk about what goes on behind the scenes when a platform like Twitter decides to ban President Donald Trump; how moderation best practices can work on decentralized protocols; and the fallout of Russian interference in the 2016 U.S. elections.
" The most striking thing to me from a lot of that work was how a lot of the Russian accounts that we identified on Twitter weren't posting lies."
Chapters:
00:00 Introduction
03:42 Yoel's Origin Story
06:25 How Content Moderation Starts
08:56 Banning Trump
11:22 The Future of Social Media Protocols
16:12 Trust and Safety on Decentralized Platforms
21:12 Inauthentic Activity and Bots
28:22 The Arms Race Against LLMs
29:47 Community Self-Governance
38:28 No, You Need Moderation
42:09 The Homogeneity of Tech Founders
46:20 Should Twitter Promote Democracy?
48:59 Why Spam Really, Really Matters
51:31 Who Else Should Be on the Podcast?
This episode was produced and edited by Eric Johnson from LightningPod.fm, and executive produced by Alice Chan from Flock Marketing.
To learn more about Rabble’s social media bill of rights, and sign up for our newsletter, visit https://revolution.social/