Scaling Laws

Lawfare & University of Texas Law School
Mar 3, 2022 • 59min

You Can’t Handle the Truth (Social)

Almost immediately after he was banned from Twitter and Facebook in January 2021, Donald Trump began promising the launch of a new, Trump-run platform to share his thoughts with the world. In February 2022, that network—Truth Social—finally launched. But it’s been a debacle from start to finish, with a lengthy waitlist and a glitchy website awaiting users who finally make it online. Drew Harwell, who covers technology at the Washington Post, has been reporting on the less-than-smooth launch of Truth Social. This week on Arbiters of Truth, our podcast series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with him about who, exactly, this platform is for and who is running it. What explains the glitchy rollout? What’s the business plan … if there is one? And how does the platform fit into the ever-expanding universe of alternative social media sites for right-wing users? Hosted on Acast. See acast.com/privacy for more information.
Feb 24, 2022 • 59min

The Information War in Ukraine

Over the last several weeks, Russian aggression toward Ukraine has escalated dramatically. Russian President Vladimir Putin announced on Feb. 21 that Russia would recognize the sovereignty of two breakaway regions in Ukraine’s east, Donetsk and Luhansk, whose years-long effort to secede from Ukraine has been engineered by Russia. Russian troops have entered eastern Ukraine as supposed “peacekeepers,” and the Russian military has taken up positions along a broad stretch of Ukraine’s border. Along with the military dimensions of the crisis, there’s also the question of how various actors are using information to provoke or defuse violence. Russia has been spreading disinformation about supposed violence against ethnic Russians in Ukraine. The United States and its Western partners, meanwhile, have been releasing intelligence about Russia’s plans—and about Russian disinformation—at a rapid and maybe even unprecedented clip. So today on Arbiters of Truth, our series on the online information ecosystem, we’re bringing you an episode about the role of truth and falsehoods in the Russian attack on Ukraine. Evelyn Douek and Quinta Jurecic spoke with Olga Lautman, a non-resident senior fellow at the Center for European Policy Analysis—who has been tracking Russian disinformation in Ukraine—and Shane Harris, a reporter at the Washington Post—who has been reporting on the crisis.
Feb 17, 2022 • 57min

The Nuts and Bolts of Social Media Transparency

Brandon Silverman is a former Facebook executive and founder of the data analytics tool CrowdTangle. Brandon joined Facebook in 2016 after the company acquired CrowdTangle, a startup designed to provide insight into what content is performing well on Facebook and Instagram, and he left in October 2021, in the midst of a debate over how much information the company should make public about its platform. As the New York Times described it, CrowdTangle “had increasingly become an irritant” to Facebook’s leadership “as it revealed the extent to which Facebook users engaged with hyperpartisan right-wing politics and misleading health information.” This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Brandon about what we mean when we talk about transparency from social media platforms and why that transparency matters. They also discussed his work advising Congress and other regulators on what legislation ensuring more openness from platforms would look like—and why it’s so hard to draft regulation that works.
Feb 10, 2022 • 50min

Spotify Faces the Content Moderation Music

The Joe Rogan Experience is perhaps the most popular podcast in the world—and it’s been at the center of a weeks-long controversy over COVID misinformation and content moderation. After Rogan invited on a guest who told falsehoods about the safety of COVID vaccines, outrage mounted toward Spotify, the podcasting and music streaming company that recently signed an exclusive deal with Rogan to distribute his show. Spotify came under pressure to intervene, as nearly 300 experts sent the company a letter demanding it take action, and musicians Neil Young and Joni Mitchell pulled their music from Spotify’s streaming service. And the controversy only seems to be growing. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Ashley Carman, a senior reporter at The Verge who writes the newsletter Hot Pod, covering the podcast and audio industry. She’s broken news on Spotify’s content guidelines and Spotify CEO Daniel Ek’s comments to the company’s staff, and we couldn’t think of a better person to talk to about this slow-moving disaster. How has Spotify responded to the complaints over Rogan, and what does that tell us about how the company is thinking about its responsibilities in curating content? What’s Ashley’s read on the state of content moderation in the podcast industry more broadly? And … is this debate even about content moderation at all?
Feb 4, 2022 • 55min

Is Block Party the Future of Content Moderation?

We talk a lot on this show about the responsibility of major tech platforms when it comes to content moderation. But what about problems the platforms can’t—or won’t—fix? Tracy Chou’s solution involves going around platforms entirely and creating tools that give power back to users to control their own experience. She’s the engineer behind Block Party, an app that allows Twitter users to protect themselves against online harassment and abuse. It’s a fine-tuned solution to a problem that a lot of Twitter users struggle with, especially women and particularly women of color. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Tracy about her work developing Block Party and how the persistent lack of diversity in Silicon Valley contributes to an environment where users have little protection against harassment. They also talked about what it’s like working with the platforms that Block Party and other apps like it are seeking to improve. And they discussed what content moderation problems these kinds of user-driven tools might help solve—and which they won’t.
Feb 4, 2022 • 56min

Defunding the Insurrectionists

As we’ve discussed on the show, online advertisements are the shifting, unstable sand on which the contemporary internet is built. And one of the many, many ways in which the online ad ecosystem is confusing and opaque involves how advertisers can find their ads popping up alongside content they’d rather not be associated with—and, all too often, not having any idea how that happened. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke to Nandini Jammi and Claire Atkin of the Check My Ads Institute. Their goal is to serve as a watchdog for the ad industry, and they’ve just started a campaign to let companies know—and call them out—when their ads are showing up next to content published by far-right figures like Steve Bannon who supported the Jan. 6 insurrection. So what is it about the ads industry that makes things so opaque, even for the companies paying to have their ads appear online? What techniques do Claire and Nandini use to trace ad distribution? And how do advertisers usually respond when Check My Ads alerts them that they’re funding “brand unsafe” content?
Feb 4, 2022 • 1h 2min

Why the Online Advertising Market is Broken

In December 2020, ten state attorneys general sued Google, alleging that the tech giant had created an illegal monopoly over online advertising. The lawsuit is ongoing, and just this January, new allegations in the states’ complaint were freshly unsealed: the states have accused Google of tinkering with its ad auctions to mislead publishers and advertisers and expand its own power in the marketplace. (Google told the Wall Street Journal that the complaint was “full of inaccuracies and lacks legal merit.”) The complaint touches on a crucial debate about the online advertising industry: does it, well, work? This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Tim Hwang, Substack’s general counsel and the author of the book “Subprime Attention Crisis: Advertising and the Time Bomb at the Heart of the Internet.” Tim argues that online advertising, which underpins the structure of the internet as we know it today, is a house of cards—that advertisers aren’t nearly as good as they claim at monetizing our attention, even as they keep marketing it anyway. So how worried should we be about this structure collapsing? If ads can’t convince us to buy things, what does that mean about our understanding of the internet? And what other possibilities are there for designing a better online space?
Feb 4, 2022 • 1h 1min

Podcasts Are the Laboratories of Misinformation

Valerie Wirtschafter and Chris Meserole, our friends at the Brookings Institution, recently published an analysis of how popular podcasters on the American right used their shows to spread the “big lie” that the 2020 election was stolen from Donald Trump. These are the same issues that led tech platforms to crack down on misinformation in the runup to the election—and yet, the question of whether podcast apps have a responsibility to moderate audio content on their platforms has largely flown under the radar. Why is that? This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic talked through this puzzle with Valerie and Chris. They discussed their findings about podcasts and the “big lie,” why it’s so hard to detect misinformation in podcasting, and what we should expect when it comes to content moderation in podcasts going forward.
Feb 4, 2022 • 59min

Content Moderation After January 6

One year ago, a violent mob broke into the U.S. Capitol during the certification of the electoral vote, aiming to overturn Joe Biden’s victory and keep Donald Trump in power as the president of the United States. The internet played a central role in the insurrection: Trump used Twitter to broadcast his falsehoods about the integrity of the election and gin up excitement over January 6, and rioters coordinated ahead of time on social media and posted pictures afterwards of the violence. In the wake of the riot, a crackdown by major social media platforms ended with Trump suspended or banned from Facebook, Twitter and other outlets. So how have platforms been dealing with content moderation issues in the shadow of the insurrection? This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic sat down for a discussion with Lawfare managing editor Jacob Schulz. To frame their conversation, they looked to the recent Twitter ban and Facebook suspension of Representative Marjorie Taylor Greene—which took place almost exactly a year after Trump’s ban.
Feb 4, 2022 • 57min

Working Toward Transparency and Accountability in Content Moderation

In 2018, a group of academics and free expression advocates convened in Santa Clara, California, for a workshop. They emerged with the Santa Clara Principles on Transparency and Accountability in Content Moderation—a high-level list of procedural steps that social media companies should take when making decisions about the content on their services. The principles quickly became influential, earning the endorsement of a number of major technology companies like Facebook. Three years later, a second, more detailed edition of the principles has just been released—the product of a broader consultation process. So what’s changed? This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with David Greene, senior staff attorney and civil liberties director at the Electronic Frontier Foundation. At EFF, he’s been centrally involved in the creation of version 2.0 of the principles. They talked about what motivated the effort to put together a new edition and what role he sees the principles playing in the conversation around content moderation. And they discussed amicus briefs that EFF has filed in the ongoing litigation over social media regulation laws passed by Texas and Florida.