389: The Rise of the Compliant Speech Platform — With Daphne Keller
Nov 7, 2024
Daphne Keller, Director of Platform Regulation at Stanford Cyber Policy Center, dives into the implications of the EU’s Digital Services Act on content moderation and freedom of expression. They discuss the complexities of platform compliance and the challenges of auditing social media under new regulations. Keller highlights the philosophical dilemmas around data use, emphasizing the pitfalls of relying solely on metrics to understand human behavior. The conversation also touches on the dynamics between tech giants and regulators, and the evolving landscape of digital governance.
The EU's Digital Services Act is reshaping content moderation by converting trust and safety teams into compliance-driven entities focused on standardized processes.
The regulatory pressures may disproportionately disadvantage smaller platforms, leading to a market dynamic that favors larger companies with more resources for compliance.
There are concerns that a strict compliance framework could result in government overreach, potentially silencing controversial yet legal speech under the guise of accountability.
Deep dives
The Transformation of Speech Governance
A significant shift is occurring in how major platforms manage online speech, driven by new regulations like the EU's Digital Services Act. This change is transforming trust and safety teams within these platforms into compliance-driven entities that standardize, track, and document content moderation decisions. The compliance approach mirrors the rigidity seen in financial regulations, requiring platforms to implement clear and consistent processes for content oversight. This fundamental reframing raises questions about the implications for free expression and speech governance in the digital landscape.
Regulatory Pressure and Market Dynamics
Emerging laws have prompted platforms, especially large ones, to bring their content moderation practices into line with new legal frameworks. Many professionals in trust and safety are acutely aware of these changes, yet the general public remains largely uninformed about the details. There is growing concern that these regulatory pressures could stifle smaller platforms, which may struggle to meet new compliance demands with limited resources. As platforms invest more in compliance infrastructure, the worry is that market dynamics will increasingly favor larger incumbents over innovative smaller entities.
Challenges of Content Moderation Standards
Establishing objective standards for digital content moderation is complex and fraught with potential pitfalls. Auditors worry that without established guidelines, their assessments could lack validity, producing a mismatch between formal compliance and effective governance. The deeper challenge lies in balancing the need for accountability against the risk that excessive regulation inadvertently suppresses vital speech. Critics point out that the quest for standardization may overlook the nuanced and evolving nature of human communication online.
The Potential for State Abuse and Overregulation
There is a genuine risk that the compliance framework could enable government overreach, potentially silencing legal but controversial speech. Mechanisms designed to ensure accountability might be exploited by state actors to impose their own biases on content moderation practices. Such dynamics complicate the relationship between platforms, users, and regulators, as the latter may inadvertently reinforce cycles of censorship. Monitoring these developments is crucial to safeguarding democratic discourse while avoiding the pitfalls of overregulation.
The Future of Digital Platforms and Governance
The evolving regulatory environment may lead to a splintered internet, with larger platforms deploying extensive compliance systems while smaller ones struggle to follow standardized rules. As a result, platforms may diverge in how they operate, with some choosing to disregard compliance altogether. And as certain platforms withdraw from markets with stringent regulations, their exit may exacerbate existing inequities in access to information. In this shifting landscape, innovative approaches to governance could take root, prioritizing decentralized models that support free speech while still ensuring accountability.
Daphne Keller (Stanford Cyber Policy Center) and Corbin Barthold (TechFreedom) have a wide-ranging conversation about the impact of the EU’s Digital Services Act on content moderation, the costs and benefits of platform transparency, the pervasiveness of complexity, the work of James C. Scott, Germans’ abiding thirst for data, the Burmese heroin trade, and more. For more, see Daphne’s recent article in Lawfare, “The Rise of the Compliant Speech Platform.”