

Scaling Laws
Lawfare & University of Texas Law School
Scaling Laws explores (and occasionally answers) the questions that keep OpenAI’s policy team up at night, the ones that motivate legislators to host hearings on AI and draft new AI bills, and the ones that are top of mind for tech-savvy law and policy students. Co-hosts Alan Rozenshtein, Professor at Minnesota Law and Research Director at Lawfare, and Kevin Frazier, AI Innovation and Law Fellow at the University of Texas and Senior Editor at Lawfare, dive into the intersection of AI, innovation policy, and the law through regular interviews with the folks deep in the weeds of developing, regulating, and adopting AI. They also provide regular rapid-response analysis of breaking AI governance news.
Episodes

Feb 4, 2022 • 55min
Is Block Party the Future of Content Moderation?
We talk a lot on this show about the responsibility of major tech platforms when it comes to content moderation. But what about problems the platforms can’t—or won’t—fix? Tracy Chou’s solution involves going around platforms entirely and creating tools that give power back to users to control their own experience. She’s the engineer behind Block Party, an app that allows Twitter users to protect themselves against online harassment and abuse. It’s a fine-tuned solution to a problem that a lot of Twitter users struggle with, especially women and particularly women of color. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Tracy about her work developing Block Party and how the persistent lack of diversity in Silicon Valley contributes to an environment where users have little protection against harassment. They also talked about what it’s like working with the platforms that Block Party and other apps like it are seeking to improve. And they discussed which content moderation problems these kinds of user-driven tools might help solve, and which they won’t.

Feb 4, 2022 • 56min
Defunding the Insurrectionists
As we’ve discussed on the show, online advertisements are the shifting, unstable sand on which the contemporary internet is built. And one of the many, many ways in which the online ad ecosystem is confusing and opaque involves how advertisers can find their ads popping up alongside content they’d rather not be associated with—and, all too often, have no idea how that happened. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke to Nandini Jammi and Claire Atkin of the Check My Ads Institute. Their goal is to serve as a watchdog for the ad industry, and they’ve just started a campaign to let companies know—and call them out—when their ads are showing up next to content published by far-right figures, like Steve Bannon, who supported the Jan. 6 insurrection. So what is it about the ads industry that makes things so opaque, even for the companies paying to have their ads appear online? What techniques do Claire and Nandini use to trace ad distribution? And how do advertisers usually respond when Check My Ads alerts them that they’re funding “brand unsafe” content?

Feb 4, 2022 • 1h 2min
Why the Online Advertising Market is Broken
In December 2020, ten state attorneys general sued Google, alleging that the tech giant had created an illegal monopoly over online advertising. The lawsuit is ongoing, and just this January, new allegations in the states’ complaint were freshly unsealed: the states have accused Google of tinkering with its ad auctions to mislead publishers and advertisers and expand its own power in the marketplace. (Google told the Wall Street Journal that the complaint was “full of inaccuracies and lacks legal merit.”) The complaint touches on a crucial debate about the online advertising industry: does it, well, work? This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Tim Hwang, Substack’s general counsel and the author of the book “Subprime Attention Crisis: Advertising and the Time Bomb at the Heart of the Internet.” Tim argues that online advertising, which underpins the structure of the internet as we know it today, is a house of cards—that the industry isn’t nearly as good as it claims at monetizing our attention, even as it keeps selling that attention anyway. So how worried should we be about this structure collapsing? If ads can’t convince us to buy things, what does that mean for our understanding of the internet? And what other possibilities are there for designing a better online space?

Feb 4, 2022 • 1h 1min
Podcasts Are the Laboratories of Misinformation
Valerie Wirtschafter and Chris Meserole, our friends at the Brookings Institution, recently published an analysis of how popular podcasters on the American right used their shows to spread the “big lie” that the 2020 election was stolen from Donald Trump. These are the same issues that led tech platforms to crack down on misinformation in the run-up to the election—and yet the question of whether podcast apps have a responsibility to moderate audio content on their platforms has largely flown under the radar. Why is that? This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic talked through this puzzle with Valerie and Chris. They discussed their findings about podcasts and the “big lie,” why it’s so hard to detect misinformation in podcasting, and what we should expect when it comes to content moderation in podcasts going forward.

Feb 4, 2022 • 59min
Content Moderation After January 6
One year ago, a violent mob broke into the U.S. Capitol during the certification of the electoral vote, aiming to overturn Joe Biden’s victory and keep Donald Trump in power as president of the United States. The internet played a central role in the insurrection: Trump used Twitter to broadcast his falsehoods about the integrity of the election and gin up excitement over January 6, and rioters coordinated ahead of time on social media and posted pictures of the violence afterwards. In the wake of the riot, a crackdown by major social media platforms ended with Trump suspended or banned from Facebook, Twitter and other outlets. So how have platforms been dealing with content moderation issues in the shadow of the insurrection? This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic sat down for a discussion with Lawfare managing editor Jacob Schulz. To frame their conversation, they looked to the recent Twitter ban and Facebook suspension of Representative Marjorie Taylor Greene—which took place almost exactly a year after Trump’s ban.

Feb 4, 2022 • 57min
Working Toward Transparency and Accountability in Content Moderation
In 2018, a group of academics and free expression advocates convened in Santa Clara, California, for a workshop. They emerged with the Santa Clara Principles on Transparency and Accountability in Content Moderation—a high-level list of procedural steps that social media companies should take when making decisions about the content on their services. The principles quickly became influential, earning the endorsement of a number of major technology companies like Facebook. Three years later, a second, more detailed edition of the principles has just been released—the product of a broader consultation process. So what’s changed? This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with David Greene, senior staff attorney and civil liberties director at the Electronic Frontier Foundation. At EFF, he’s been centrally involved in the creation of version 2.0 of the principles. They talked about what motivated the effort to put together a new edition and what role he sees the principles playing in the conversation around content moderation. And they discussed amicus briefs that EFF has filed in the ongoing litigation over social media regulation laws passed by Texas and Florida.

Feb 4, 2022 • 57min
Free the Data!
On this show, we’ve discussed no end of proposals for how to regulate online platforms. But there’s something many of those proposals are missing: data about how the platforms actually work. Now, there’s legislation in Congress that aims to change that. The Platform Accountability and Transparency Act, sponsored by Senators Chris Coons, Rob Portman and Amy Klobuchar, would create a process through which academic researchers could gain access to information about the operation of these platforms—peering under the hood to see what’s actually happening in our online ecosystems, and perhaps how they could be improved. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with the man who drafted the original version of this legislation—Nate Persily, the James B. McClatchy Professor of Law at Stanford Law School. He’s been hard at work on the draft bill, which he finally published in October. And he collaborated with Coons, Portman and Klobuchar to work his ideas into the Platform Accountability and Transparency Act. They talked about how Nate’s proposal would work, why researcher access to data is so important, and what the prospects are for lasting reforms like this emerging from Congress.

Feb 4, 2022 • 55min
Content Moderation’s Original ‘Decider’
We talk a lot about how content moderation involves hard decisions and trade-offs—but at the end of the day, someone has to decide what stays on a platform and what comes down. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with “The Decider”—Nicole Wong, who earned that tongue-in-cheek nickname during her time at Google in the 2000s. As the company’s deputy general counsel, Nicole was in charge of deciding what content Google should remove or keep up in response to complaints from users and governments alike. She has since held roles as Twitter’s legal director of products and as deputy chief technology officer of the United States under the Obama administration. In that time, the role of social media platforms in shaping society has grown enormously, but how much have content moderation debates really changed? Quinta and Evelyn spoke with Nicole about her time as the Decider, what’s new and what’s stayed the same since the early days of content moderation, and how her thinking about the danger and promise of the internet has changed over the years.

Feb 4, 2022 • 1h
How Zoom Thinks About Content Moderation
This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with some of the people behind the app that, by this point in the pandemic, you’re probably sick of: Zoom. Quinta and Evelyn sat down with Josh Kallmer, Zoom’s head of global public policy and government relations, and Josh Parecki, Zoom’s associate general counsel and head of trust and safety. Most of us have used Zoom regularly over the last few years thanks to COVID-19, but while you’re likely familiar with the platform as a mechanism for work meetings and virtual happy hours, you may not have thought about it in the context of content moderation. Josh and Josh explained the kinds of content moderation issues they grapple with in their roles at Zoom, how their moderation and user appeals process works, and why Zoom doesn’t think of itself like a phone line or a mail carrier, services that are almost entirely hands-off when it comes to the content they carry.

Feb 4, 2022 • 1h 7min
Rational Security's The 'Nothing To Be Thankful For' Edition
For Thanksgiving, we’re bringing you something a little different—an episode of Rational Security, our light, conversational show about national security and related topics. This week, Alan, Quinta and Scott were joined by a special guest: Evelyn Douek, Quinta’s co-host of the Arbiters of Truth series on the Lawfare podcast feed! They sat down to discuss:
—“Getting Rittenhoused”: A jury recently acquitted Kyle Rittenhouse, who was 17 at the time of the shootings, of murder charges for shooting two men in what he claimed was self-defense during last summer’s unrest. What do his trial and its aftermath tell us about the intersection of politics with our criminal justice system?
—“Now That’s a Power Serve”: A global pressure campaign by professional tennis players has forced Chinese officials to disclose the location of tennis player Peng Shuai, who disappeared after publicly accusing a former senior official of sexual assault. Is this a new model for dealing with Chinese human rights abuses?
—“Duck Say Quack and Fish Go Blub—But What Did Fox Say?”: Two prominent conservative commentators have resigned from Fox News over its release of a Tucker Carlson film that they say spreads misinformation and promotes violence. Will this be enough to force the network to curb its behavior?
For object lessons, Quinta endorsed her favorite pie dough recipe. Alan in turn made an unorthodox recommendation of what to put in that dough: sweet potato pie. Scott encouraged listeners to follow up that big meal with a cup of coffee, made on his beloved AeroPress with a Prismo filter attachment. And if that doesn’t work, Evelyn suggested folks tuck in for a nap with her favorite weighted blanket from Bearaby.