Forcing companies to disclose their own hand in how much they've pushed a piece of content is a necessary prerequisite for the next step. So we go from amplification transparency to amplification liability, where these platforms become liable for content when amplifying it is their curatorial decision. This is essentially a shift so that the toxic foam that currently shows up on the balance sheet of society becomes a liability on the companies' own balance sheets. I can't stress how important I think this is. The companies will know that they have to be very careful about their recommendation engines, because in the future they will be held liable for all the toxic foam.
Back in January 2020, Tristan Harris went to Washington, D.C. to testify before the U.S. Congress on the harms of social media. A few weeks ago, he returned, virtually this time, for another hearing, Algorithms and Amplification: How Social Media Platforms’ Design Choices Shape Our Discourse and Our Minds. He testified alongside Dr. Joan Donovan, Research Director at the Harvard Kennedy School’s Shorenstein Center on Media, Politics and Public Policy, and the heads of policy from Facebook, YouTube, and Twitter. The senators’ animated questioning demonstrated a deeper understanding of how these companies’ fundamental business models and design properties fuel hate and misinformation, and many of the lawmakers expressed a desire and willingness to take regulatory action. But there’s still room for a more focused conversation. “It’s not about whether they filter out bad content,” says Tristan, “but really whether the entire business model of capturing human attention is a good way to organize society.” In this episode, a follow-up to last year’s “Mr. Harris Goes to Washington,” Tristan and Aza Raskin debrief about what was different this time, and what work lies ahead to pave the way for effective policy.