"The attention economy" and "technology hijacking our minds" are phrases that have started to colonize the public discourse. When we have a phrase that describes the problem, instead of talking about some bad videos on YouTube, we're describing the problem in a systemic way. We might want to put out a call to the heads of Unilever and P&G to be really aware of the systemic problem here. Right now these companies respond only when there's a specific issue, like child exploitation or anti-vaccine promotion.
When we press play on a YouTube video, we set in motion an algorithm that taps all available data to find the next video that will keep us glued to the screen. Because of its advertising-based business model, YouTube's top priority is not to help us learn to play the accordion, tie a bow tie, heal an injury, or see a new city; it's to keep us staring at the screen for as long as possible, regardless of the content. This episode's guest, AI expert Guillaume Chaslot, helped write YouTube's recommendation engine and explains how those priorities spin up outrage, conspiracy theories, and extremism. After leaving YouTube, Guillaume made it his mission to shed light on those hidden patterns through his website, AlgoTransparency.org, which tracks and publicizes YouTube's recommendations of videos from controversial channels. Through his work, he encourages YouTube to take responsibility for the videos it promotes and aims to give viewers more control.