

Scaling Laws
Lawfare & University of Texas Law School
Scaling Laws explores (and occasionally answers) the questions that keep OpenAI’s policy team up at night, the ones that motivate legislators to host hearings on AI and draft new AI bills, and the ones that are top of mind for tech-savvy law and policy students. Co-hosts Alan Rozenshtein, Professor at Minnesota Law and Research Director at Lawfare, and Kevin Frazier, AI Innovation and Law Fellow at the University of Texas and Senior Editor at Lawfare, dive into the intersection of AI, innovation policy, and the law through regular interviews with the folks deep in the weeds of developing, regulating, and adopting AI. They also provide rapid-response analysis of breaking AI governance news. Hosted on Acast. See acast.com/privacy for more information.
Episodes

Jul 21, 2022 • 56min
Online Speech and Section 230 After Dobbs
When the Supreme Court handed down its opinion in Dobbs v. Jackson Women’s Health Organization, overturning Roe v. Wade, the impact of the decision on the internet may not have been front of mind for most people thinking through the implications. But in the weeks after the Court’s decision, it’s become clear that the post-Dobbs legal landscape around abortion implicates many questions around not only data and digital privacy, but also online speech. One piece of model state legislation, for example, would criminalize “hosting or maintaining a website, or providing internet service, that encourages or facilitates efforts to obtain an illegal abortion.” This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Evan Greer, the director of the digital rights organization Fight for the Future. She recently wrote an article in Wired with Lia Holland arguing that “Section 230 is a Last Line of Defense for Abortion Speech Online.” They talked about what role Section 230’s protections have to play when it comes to liability for speech about abortion and what content moderation looks like in a post-Dobbs world.

Jul 14, 2022 • 58min
When Doctors Spread Disinformation
Since the beginning of the pandemic, we’ve talked a lot on this show about how falsehoods about the coronavirus are spread and generated. For this episode, Evelyn Douek and Quinta Jurecic spoke with two emergency medicine physicians who have seen the practical effects of those falsehoods while treating patients over the last two years. Nick Sawyer and Taylor Nichols are two of the cofounders of the organization No License for Disinformation, a group that advocates for medical authorities to take disciplinary action against doctors spreading misinformation and disinformation about COVID-19. They argue that state medical boards, which grant physicians the licenses that authorize them to practice medicine, could play a more aggressive role in curbing falsehoods. How many doctors have been disciplined, and why do Nick and Taylor believe that state medical boards have fallen down on the job? What are the possibilities for more aggressive action—and how does the First Amendment limit those possibilities? And how much good can the threat of discipline do in curbing medical misinformation, anyway?

Jul 7, 2022 • 1h 4min
What We Talk About When We Talk About Algorithms
Algorithms! We hear a lot about them. They drive social media platforms and, according to popular understanding, are responsible for a great deal of what’s wrong with the internet today—and maybe the downfall of democracy itself. But … what exactly are algorithms? And, given that they’re not going away, what should they be designed to do? Evelyn Douek and Quinta Jurecic spoke with Jonathan Stray, a senior scientist at the Berkeley Center for Human-Compatible AI and someone who has thought a lot about what we mean when we say the word “algorithm”—and also when we discuss things like “engagement” and “amplification.” He helped them pin down a more precise understanding of what those terms mean and why that precision is so important in crafting good technology policy. They also talked about what role social media algorithms do and don’t play in stoking political polarization, and how they might be designed to decrease polarization instead. If you’re interested, you can read the Senate testimony by Dean Eckles on algorithms that Jonathan mentions during the show. We also mentioned this article by Daniel Kreiss on polarization.

Jun 30, 2022 • 54min
The Jan. 6 Committee Takes On the Big Lie
The House committee investigating the Jan. 6 insurrection is midway through a blockbuster series of hearings exploring Donald Trump’s efforts to overturn the 2020 election and disrupt the peaceful transfer of power. Central to those efforts, of course, was the Big Lie—the false notion that Trump was cheated out of victory in 2020. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Kate Starbird, an associate professor of Human Centered Design & Engineering at the University of Washington—and repeat Arbiters of Truth guest. Kate has come on the show before to talk about misinformation and Jan. 6, and she and a team of coauthors just released a comprehensive analysis of tweets spreading misinformation around the 2020 election. So she’s the perfect person with whom to discuss the Jan. 6 committee hearings and misinformation. What does Kate’s research show about how election falsehoods spread, and who spread them? How has, and hasn’t, the Jan. 6 committee incorporated the role of misinformation into the story it’s telling about the insurrection? And is there any chance the committee can break through and get the truth to the people who most need to hear it?

Jun 23, 2022 • 51min
Rebroadcast: The Most Intense Online Disinformation Event in American History
If you’ve been watching the hearings convened by the House select committee on Jan. 6, you’ve seen a great deal about how the Trump campaign generated and spread falsehoods about supposed election fraud in 2020. As the committee has argued, those falsehoods were crucial in generating the political energy that culminated in the explosion of the Jan. 6 insurrection. What shape did those lies take, and how did social media platforms attempt to deal with them at the time? Today, we’re bringing you an episode of our Arbiters of Truth series on the online information ecosystem. In fact, we’re rebroadcasting an episode we recorded in November 2020 about disinformation and the 2020 election. In late November 2020, after Joe Biden cemented his victory as the next president but while the Trump campaign was still pushing its claims of election fraud online and in court, Evelyn Douek and Quinta Jurecic spoke with Alex Stamos, the director of the Stanford Internet Observatory. Their conversation then was a great overview of the state of election security and the difficulty of countering false claims around the integrity of the vote. It’s worth a listen today as the Jan. 6 committee reminds us what the political and media environment was like in the aftermath of the election and how the Trump campaign committed to election lies that still echo all too loudly. And though it’s a year and a half later, the problems we’re discussing here certainly haven’t gone away.

Jun 16, 2022 • 56min
Defamation, Disinformation, and the Depp-Heard Trial
If you loaded up the internet or turned on the television somewhere in the United States over the last two months, it’s been impossible to avoid news coverage of the defamation trial of actors Johnny Depp and Amber Heard—both of whom sued each other over a dispute relating to allegations by Heard of domestic abuse by Depp. In early June, a Virginia jury found that each had defamed the other. The litigation has received a great deal of coverage for what it might say about the fate of the Me Too movement—but the flood of falsehoods online around the trial raises questions about how useful defamation law can really be in countering lies. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with RonNell Andersen Jones, the Lee E. Teitelbaum Professor of Law at the University of Utah College of Law and an expert on the First Amendment and the interaction between the press and the courts. Along with Lyrissa Lidsky, she’s written about defamation law, disinformation, and the Depp-Heard litigation. They talked about why some commentators think defamation could be a useful route to counter falsehoods, why RonNell thinks the celebrity litigation undercuts that argument, and the few cases in which claims of libel or slander really could have an impact in limiting the spread of lies.

Jun 9, 2022 • 60min
The Supreme Court Blocks the Texas Social Media Law
On May 31, by a five-four vote, the Supreme Court blocked a Texas law from going into effect that would have sharply limited how social media companies could moderate their platforms and required companies to abide by various transparency requirements. We’ve covered the law on this show before—we recorded an episode right after the U.S. Court of Appeals for the Fifth Circuit allowed Texas to implement the law, in the same ruling that the Supreme Court just vacated. But there’s enough interesting stuff in the Supreme Court’s order—and in Justice Samuel Alito’s dissent—that we thought it was worth another bite at the apple. So this week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic invited Genevieve Lakier, professor of law at the University of Chicago and Evelyn’s colleague at the Knight First Amendment Institute, to walk us through just what happened. What exactly did the Supreme Court do? Why does Justice Alito seem to think that the Texas law has a decent chance of surviving a First Amendment challenge? And what does this suggest about the possible futures of the extremely unsettled landscape of First Amendment law?

Jun 2, 2022 • 56min
Bringing in the Content Moderation Auditors
As transparency reporting about content moderation enforcement has become standard across the platform industry, there have been growing questions about the reliability and accuracy of the reports the platforms are producing. Because all such reporting is entirely voluntary and the content moderation industry in general is very opaque, it’s hard to know how much to trust the figures that companies report in their quarterly or biannual enforcement reports. As a result, there have been growing calls for independent audits of these figures, and last month, Meta released its first-ever independent audit of its content moderation reporting systems. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek sat down with someone who actually knows something about auditing: Colleen Honigsberg, an associate professor of law at Stanford Law School whose research focuses on the empirical study of corporate and securities law. They talked about how auditors work, the promises and pitfalls of auditing in other contexts and what those might teach us about auditing content moderation, and whether this is going to be a useful regulatory tool.

May 26, 2022 • 58min
Social Media Platforms and the Buffalo Shooting
On May 14, a shooter attacked a supermarket in a historically Black neighborhood of Buffalo, New York, killing ten people and wounding three. The streaming platform Twitch quickly disabled the livestream the shooter had published of the attack—but video of the violence, and copies of the white supremacist manifesto released by the attacker online, continue to circulate on the internet. How should we evaluate the response of social media platforms to the tragedy in Buffalo? This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Brian Fishman, who formerly worked at Facebook, now Meta, as the policy director for counterterrorism and dangerous organizations. Brian helped lead Facebook’s response to the 2019 Christchurch shooting, another act of far-right violence livestreamed online. He walked us through how platforms respond to crises like these, why it’s so difficult to remove material like the Buffalo video and manifesto from the internet, and what it would look like for platforms to do better.

May 19, 2022 • 59min
The Platforms versus Texas in the Supreme Court
On May 12, the U.S. Court of Appeals for the Fifth Circuit allowed an aggressive new Texas law regulating social media to go into effect. The law, known as HB20, seeks to restrict large social media platforms from taking down content on the basis of viewpoint—effectively restricting companies from engaging in a great deal of the content moderation that they currently perform. It also imposes a range of transparency and due process requirements on platforms with respect to their content moderation. A group of technology companies challenging the law have filed an emergency application to the Supreme Court seeking to put HB20 back on hold while they continue to litigate the law’s constitutionality under the First Amendment. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Alex Abdo, litigation director at the Knight First Amendment Institute, and Scott Wilkens, senior staff attorney at Knight. The Institute, where Evelyn is a senior research fellow, filed an amicus brief in the Fifth Circuit, taking a middle ground between Texas—which argues that the First Amendment poses no bar to HB20—and the plaintiffs—who argue that the First Amendment prohibits this regulation and many other types of social media regulation besides. So what does the Texas law actually do? Where does the litigation stand—and what will the impact of the Fifth Circuit’s ruling be? And how does the Knight First Amendment Institute interpret, well, the First Amendment?