Scaling Laws

Lawfare & University of Texas Law School
Jun 16, 2022 • 56min

Defamation, Disinformation, and the Depp-Heard Trial

If you loaded up the internet or turned on the television somewhere in the United States over the last two months, it’s been impossible to avoid news coverage of the defamation trial of actors Johnny Depp and Amber Heard—who sued each other over a dispute arising from Heard’s allegations of domestic abuse by Depp. In early June, a Virginia jury found that each had defamed the other. The litigation has received a great deal of coverage for what it might say about the fate of the Me Too movement—but the flood of falsehoods online around the trial raises questions about how useful defamation law can really be in countering lies. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with RonNell Andersen Jones, the Lee E. Teitelbaum Professor of Law at the University of Utah College of Law and an expert on the First Amendment and the interaction between the press and the courts. Along with Lyrissa Lidsky, she’s written about defamation law, disinformation, and the Depp-Heard litigation. They talked about why some commentators think defamation could be a useful route to counter falsehoods, why RonNell thinks the celebrity litigation undercuts that argument, and the few cases in which claims of libel or slander really could have an impact in limiting the spread of lies. Hosted on Acast. See acast.com/privacy for more information.
Jun 9, 2022 • 60min

The Supreme Court Blocks the Texas Social Media Law

On May 31, by a five-four vote, the Supreme Court blocked a Texas law from going into effect that would have sharply limited how social media companies could moderate their platforms and required companies to abide by various transparency requirements. We’ve covered the law on this show before—we recorded an episode right after the U.S. Court of Appeals for the Fifth Circuit allowed Texas to implement the law, in the same ruling that the Supreme Court just vacated. But there’s enough interesting stuff in the Supreme Court’s order—and in Justice Samuel Alito’s dissent—that we thought it was worth another bite at the apple. So this week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic invited Genevieve Lakier, professor of law at the University of Chicago and Evelyn’s colleague at the Knight First Amendment Institute, to walk us through just what happened. What exactly did the Supreme Court do? Why does Justice Alito seem to think that the Texas law has a decent chance of surviving a First Amendment challenge? And what does this suggest about the possible futures of the extremely unsettled landscape of First Amendment law?
Jun 2, 2022 • 56min

Bringing in the Content Moderation Auditors

As transparency reporting about content moderation enforcement has become standard across the platform industry, there have been growing questions about the reliability and accuracy of the reports the platforms are producing. With all reporting being entirely voluntary and the content moderation industry in general being very opaque, it’s hard to know how much to trust the figures that companies report in their quarterly or biannual enforcement reports. As a result, there have been growing calls for independent audits of these figures, and last month, Meta released its first-ever independent audit of its content moderation reporting systems. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek sat down with someone who actually knows something about auditing: Colleen Honigsberg, an associate professor of law at Stanford Law School, whose research is focused on the empirical study of corporate and securities law. They talked about how auditors work, the promises and pitfalls of auditing in other contexts and what that might teach us about auditing in the content moderation context, and whether this is going to be a useful regulatory tool.
May 26, 2022 • 58min

Social Media Platforms and the Buffalo Shooting

On May 14, a shooter attacked a supermarket in a historically Black neighborhood of Buffalo, New York, killing ten people and wounding three. The streaming platform Twitch quickly disabled the livestream the shooter had published of the attack—but video of the violence, and copies of the white supremacist manifesto released by the attacker online, continue to circulate on the internet. How should we evaluate the response of social media platforms to the tragedy in Buffalo? This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Brian Fishman, who formerly worked at Facebook, now Meta, as the policy director for counterterrorism and dangerous organizations. Brian helped lead Facebook’s response to the 2019 Christchurch shooting, another act of far-right violence livestreamed online. He walked us through how platforms respond to crises like these, why it’s so difficult to remove material like the Buffalo video and manifesto from the internet, and what it would look like for platforms to do better.
May 19, 2022 • 59min

The Platforms versus Texas in the Supreme Court

On May 12, the U.S. Court of Appeals for the Fifth Circuit allowed an aggressive new Texas law regulating social media to go into effect. The law, known as HB20, seeks to restrict large social media platforms from taking down content on the basis of viewpoint—effectively restricting companies from engaging in a great deal of the content moderation that they currently perform. It also imposes a range of transparency and due process requirements on platforms with respect to their content moderation. A group of technology companies challenging the law has filed an emergency application to the Supreme Court seeking to put HB20 back on hold while they continue to litigate the law’s constitutionality under the First Amendment. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Alex Abdo, litigation director at the Knight First Amendment Institute, and Scott Wilkens, senior staff attorney at Knight. The Institute, where Evelyn is a senior research fellow, filed an amicus brief in the Fifth Circuit, taking a middle ground between Texas—which argues that the First Amendment poses no bar to HB20—and the plaintiffs—who argue that the First Amendment prohibits this regulation and many other types of social media regulation besides. So what does the Texas law actually do? Where does the litigation stand—and what will the impact of the Fifth Circuit’s ruling be? And how does the Knight First Amendment Institute interpret, well, the First Amendment?
May 12, 2022 • 55min

When Governments Turn Off the Internet

Internet blackouts are on the rise. Since 2016, governments around the world have fully or partially shut down access to the internet almost 1,000 times, according to a tally by the human rights organization Access Now. As the power of the internet grows, this tactic has only become more common as a means of political repression. Why is this, and how, exactly, does a government go about turning off the internet? This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke on this topic with Peter Guest, the enterprise editor for the publication Rest of World, which covers technology outside the regions usually described as the West. He’s just published a new project with Rest of World diving deep into internet shutdowns—and the three dug into the mechanics of internet blackouts, why they’re increasing, and their wide-reaching effects.
May 5, 2022 • 59min

Pay Attention to Europe’s Digital Services Act

While the U.S. Congress has been holding hearing after hearing with tech executives, full of yelling and short on progress, Europe has been quietly working away on some major tech regulations. Last month, it reached agreement on the content moderation piece of this package: the Digital Services Act. It's sweeping in scope and likely to have effects far beyond Europe. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek sat down with Daphne Keller, the director of the Program on Platform Regulation at the Stanford Cyber Policy Center, to get the rundown. What exactly is in the act? What does she like about it, and what doesn't she? And how will the internet look different once it comes into force?
Apr 28, 2022 • 58min

The Professionalization of Content Moderation

This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek spoke to Charlotte Willner, who has been working in content moderation longer than just about anyone. Charlotte is now the executive director of the Trust and Safety Professionals Association, an organization that brings together the professionals who write and enforce the rules for what’s fair game and what’s not on online platforms. Before that, she worked in Trust and Safety at Pinterest, and before that she built the very first safety operations team at Facebook. Evelyn asked Charlotte what it was like trying to build a content moderation system from the ground up, what has changed since those early days (spoiler: it’s a lot!) and—of course—whether she had any advice for Twitter’s new owner given all her experience helping keep platforms safe.
Apr 21, 2022 • 34min

Taylor Lorenz on Taking Internet Culture Seriously

This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with a reporter who has carved out a unique beat writing about not just technology but the creativity and peculiarities of the people who use it—Taylor Lorenz, a columnist at the Washington Post covering technology and online culture. Her recent writing includes reporting on “algospeak”—that is, how algorithmic amplification changes how people talk online—and coverage of the viral Twitter account Libs of TikTok, which promotes social media posts of LGBTQ people for right-wing mockery. They talked about the quirks of a culture shaped in conversation with algorithms, the porous border between internet culture and political life in the United States, and what it means to take the influence of social media seriously, for good and for ill.
Apr 14, 2022 • 60min

Bringing Evidence of War Crimes From Twitter to the Hague

The internet is increasingly emerging as a source for identification and documentation of war crimes, as the Russian invasion of Ukraine has devastatingly proven yet again. But how does an image of a possible war crime go from social media to before a tribunal in a potential war crimes prosecution? On a recent episode of Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Nick Waters, the lead on Justice and Accountability at Bellingcat, about how open-source investigators go about documenting evidence of atrocity. This week on the show, Evelyn and Quinta interviewed Alexa Koenig, the executive director of the Human Rights Center at the University of California, Berkeley, and an expert on using digital evidence for justice and accountability. They talked about how international tribunals have adapted to using new forms of evidence derived from the internet, how social media platforms have helped—and hindered—collection of this kind of evidence, and the work Alexa has done to create a playbook for investigators downloading and collecting material documenting atrocities. Because of the nature of the conversation, this discussion contains some descriptions of violence that might be upsetting for some listeners.
