Twitter is trying to regulate a brain implant into society, not with tweezers but sort of like with salad tongs. I wonder if you define the attention metric a little more broadly, so it's not just the specific metric you're not allowed to optimize for, but for being an attention company, you don't get these kinds of protections. Is that enough? That's exactly right. We don't want to measure the wrong things and optimize for the wrong things. The core of it was that we designed for the wrong thing from the beginning, without any metrics at all, without any numbers at all, without any algorithms at all. Twitter is still based on human performance.
Back in January 2020, Tristan Harris went to Washington, D.C. to testify before the U.S. Congress on the harms of social media. A few weeks ago, he returned — virtually — for another hearing, Algorithms and Amplification: How Social Media Platforms’ Design Choices Shape Our Discourse and Our Minds. He testified alongside Dr. Joan Donovan, Research Director at the Harvard Kennedy School’s Shorenstein Center on Media Politics and Public Policy, and the heads of policy from Facebook, YouTube, and Twitter. The senators’ animated questioning demonstrated a deeper understanding of how these companies’ fundamental business models and design properties fuel hate and misinformation, and many of the lawmakers expressed a desire and willingness to take regulatory action. But there’s still room for a more focused conversation. “It’s not about whether they filter out bad content,” says Tristan, “but really whether the entire business model of capturing human performance is a good way to organize society.” In this episode, a follow-up to last year’s “Mr. Harris Goes to Washington,” Tristan and Aza Raskin debrief about what was different this time, and what work lies ahead to pave the way for effective policy.