

Scaling Laws
Lawfare & University of Texas Law School
Scaling Laws explores (and occasionally answers) the questions that keep OpenAI’s policy team up at night, the ones that motivate legislators to host hearings on AI and draft new AI bills, and the ones that are top of mind for tech-savvy law and policy students. Co-hosts Alan Rozenshtein, Professor at Minnesota Law and Research Director at Lawfare, and Kevin Frazier, AI Innovation and Law Fellow at the University of Texas and Senior Editor at Lawfare, dive into the intersection of AI, innovation policy, and the law through regular interviews with the folks deep in the weeds of developing, regulating, and adopting AI. They also provide regular rapid-response analysis of breaking AI governance news. Hosted on Acast. See acast.com/privacy for more information.
Episodes

Feb 4, 2022 • 1h 7min
Coordinating Inauthentic Behavior With Facebook’s Head of Security Policy
This week on Arbiters of Truth, our podcast on our online information ecosystem, Evelyn Douek and Quinta Jurecic bring you an episode they’ve wanted to record for a while: a conversation with Nathaniel Gleicher, the head of security policy at Facebook. He runs the corner of Facebook that focuses on identifying and tackling threats aimed at the platform, including information operations.

They discussed a new report released by Nathaniel’s team on “The State of Influence Operations 2017-2020.” What kinds of trends is Facebook seeing? What is Nathaniel’s response to reports that Facebook is slower to act in taking down dangerous content outside the U.S.? What about the argument that Facebook is designed to encourage circulation of exactly the kind of incendiary content that Nathaniel is trying to get rid of?

And, of course, they argued over Facebook’s use of the term “coordinated inauthentic behavior” to describe what Nathaniel argues is a particularly troubling type of influence operation. How does Facebook define it? Does it mean what you think it means?

Feb 4, 2022 • 55min
Information Operations, Then and Now
This week on Arbiters of Truth, our podcast on our online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Camille François, the chief innovation officer at Graphika, about a new report released by her team earlier this month on an apparent Russian influence operation aimed at so-called “alt-tech” platforms, like Gab and Parler. A group linked to the Russian Internet Research Agency “troll farm” has been posting far-right memes and content on these platforms over the last year. But how effective has their effort really been? What does the relatively small scale of the operation tell us about how foreign interference has changed in the last four years? Has the media’s—and the public’s—understanding of information operations caught up to that changing picture?

One note: Camille references the “ABC framework” for understanding information operations. That’s referring to a framework she developed where operations can be understood along three vectors: manipulative actors, deceptive behavior and harmful content.

Feb 3, 2022 • 59min
A TikTok Tick Tock
TikTok has rapidly become one of the most popular apps for teenagers across the world for dancing, lip-syncing and sharing details about their lives. But if you cast your mind back to last year—specifically, August 2020—you may recall that the app’s future in the United States suddenly fell into doubt. The Trump administration began arguing that the app’s ownership by the Chinese company ByteDance raised problems of national security for the United States. ByteDance was ordered to divest from TikTok, and the app, along with the popular China-based chat app WeChat, faced U.S. sanctions.

But you might have noticed that your teenager is still making TikTok videos. And President Biden issued his own executive order last week revoking Trump’s sanctions. So, what on earth is happening?

On this week’s episode of our Arbiters of Truth series on our online information ecosystem, Evelyn Douek and Quinta Jurecic spoke to Bobby Chesney, Lawfare co-founder and Charles I. Francis Professor in Law at the University of Texas School of Law, about what’s happened to TikTok over the past year. Bobby brought us up to speed with the Trump administration’s offensive on TikTok, why the app has survived so far and why TikTok shouldn’t breathe easy just yet about Biden’s executive order.

Feb 3, 2022 • 59min
The Empire (Facebook) Strikes Back (at the Oversight Board’s Trump Decision)
If you’ve listened to this show, you’ve probably read a fair number of news stories—and maybe even listened to some podcast episodes—about the Facebook Oversight Board’s recent ruling on the platform’s decision to ban President Trump’s account. The board temporarily allowed Facebook to keep Trump off the platform, but criticized the slapdash way Facebook made that call and provided a long list of recommendations for Facebook to respond to.

Well, now Facebook has responded—announcing that it will ban Trump from the platform for two years. And though the response hasn’t gotten as much coverage as the initial ruling, it’s arguably more important for what it says about both Facebook and the Facebook Oversight Board’s role in the future of content moderation.

This week on the Lawfare Podcast's Arbiters of Truth series on our online information ecosystem, Quinta Jurecic interviewed Lawfare managing editor Jacob Schulz and Arbiters of Truth co-host Evelyn Douek about Facebook’s response to the board. What did Facebook say in addition to its two-year Trump ban? Why is Evelyn grumpy about it? And what’s next for Facebook, the Oversight Board and Trump himself?

Feb 3, 2022 • 60min
The Arrival of International Human Rights Law in Content Moderation
Way back at the beginning of the Arbiters of Truth podcast series on our online information ecosystem, Evelyn Douek and Quinta Jurecic invited David Kaye to talk about international human rights law (IHRL) and content moderation. David is a clinical professor of law at the University of California, Irvine, and when he was first on the show, he was also the United Nations Special Rapporteur on freedom of expression. It’s been a year and a half since then, and in the intervening time, David’s vision of IHRL as a guiding force for content moderation has become mainstream. So Quinta and Evelyn asked him back on to discuss the increasingly important role played by IHRL in content moderation—and what it really means in practice. They also talked about the rise of digital authoritarianism around the world and what international law and leading democracies can do about it.

Feb 3, 2022 • 1h
The Christchurch Call, Two Years On
In March 2019, a shooter carried out two mass killings at mosques in Christchurch, New Zealand, livestreaming the first shooting on Facebook. Two months later, New Zealand Prime Minister Jacinda Ardern and French President Emmanuel Macron convened the Christchurch Call—a commitment joined by both governments and technology companies “to eliminate terrorist and violent extremist content online.”

It’s now been two years since the Christchurch Call. To discuss those years and what comes next, Evelyn Douek and Quinta Jurecic of the Arbiters of Truth series of the Lawfare Podcast spoke with Dia Kayyali, who serves as a co-chair of the Advisory Network to the Christchurch Call, a group of civil society organizations that work to ensure that the signatories to the Call consider a more diverse range of expertise and perspectives when implementing its commitments. Dia is a long-time digital rights activist and the associate director for advocacy at Mnemonic, an organization that works to preserve online documentation of human rights abuses. What has their experience been like as a voice for civil society in these conversations around the Call? What should we make of the recent decision by the Biden administration to sign the United States on to the Call? And what are the risks of potentially over-aggressive moderation in an effort to take down “terrorist” content?

Feb 3, 2022 • 46min
The Disinformation Nextdoor
This week on Arbiters of Truth, the Lawfare Podcast’s series on our online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with the journalist Will Oremus, who until recently was a senior writer at the technology publication OneZero and who is one of the most astute observers of online platforms and their relationship to the media. They dug into Will’s reporting on the social media platform Nextdoor. The app is designed to connect neighbors, but Will argues it’s filling the space left by collapsing local news—which may not be the best development when the platform is struggling with many of the common challenges of content moderation. And, of course, they also talked about the inescapable, ever-present elephant in the room—the Facebook Oversight Board’s ruling on Donald Trump’s account.

Feb 3, 2022 • 45min
The Facebook Oversight Board Rules on Trump
The wait is over. Four months after Facebook indefinitely banned Donald Trump from its platform following the Capitol riot, the Facebook Oversight Board—the platform’s self-appointed quasi-court—has weighed in on whether or not it was permissible for Facebook to do so. And the answer is ... complicated. Mark Zuckerberg can still keep Trump off his platform for now, but the board says that Facebook must review its policies and make a final decision about the former president’s fate within six months.

To discuss the decision, Lawfare Editor-in-Chief Benjamin Wittes hosted a special episode of Arbiters of Truth, our Lawfare Podcast miniseries on our online information ecosystem. He sat down with Evelyn Douek, Quinta Jurecic and Lawfare Deputy Managing Editor Jacob Schulz for a conversation about the Oversight Board’s ruling. Did the Oversight Board make the right call? What might the mood be like in Facebook headquarters right now? What about Twitter’s? And is this decision really the Oversight Board’s Marbury v. Madison moment?

Feb 3, 2022 • 59min
Israel’s 'Cyber Unit' and Extra-legal Content Take-downs
Odds are, you probably haven’t heard of the Israeli government’s “Cyber Unit,” but it’s worth paying attention to whether or not you live in Israel and the Palestinian territories. It’s an entity that, among other things, reaches out to major online platforms like Facebook and Twitter with requests that the platforms remove content. It’s one of a number of such agencies around the globe, which are known as Internet Referral Units. Earlier in April, the Israeli Supreme Court gave a green light to the unit’s activities, rejecting a legal challenge that charged the unit with infringing on constitutional rights.

This week on Arbiters of Truth, the Lawfare Podcast’s miniseries on our online information ecosystem, Evelyn Douek and Quinta Jurecic talked to Fady Khoury and Rabea Eghbariah, who were part of the legal team that challenged the Cyber Unit’s work on behalf of Adalah, the Legal Center for Arab and Minority Rights in Israel. Why do they—and many other human rights activists—find Internet Referral Units so troubling, and why do governments like the units so much? Why did the Israeli Supreme Court disagree with Fady and Rabea’s challenge to the unit’s activities? And what does the Court’s decision say about the developing relationship between countries’ legal systems and platform content moderation systems?

Feb 3, 2022 • 55min
The Challenges of Audio Content Moderation
This week on Arbiters of Truth, the Lawfare Podcast’s miniseries on our online information ecosystem, Evelyn Douek and Quinta Jurecic talked to Sean Li, who until recently was the head of Trust and Safety at Discord. Discord is experiencing phenomenal growth and is an established player in a space that is the new hot thing: audio social media. And as the head of Trust and Safety, Sean was responsible for running the team that mitigates all the bad stuff that happens on the platform.

Evelyn and Quinta asked Sean what it’s like to have that kind of power—to be the eponymous “arbiter of truth” of a slice of the internet. They also discussed what makes content moderation of live audio content different from the kind we normally talk about—namely, text-based platforms. As almost every social media platform is trying to get into audio, what should they be prepared for?