I noticed, starting about two years ago, that YouTube recommendations were showing a lot of anti-vaccine conspiracy theories. For instance, there was Bill Maher saying, hey, don't take the flu shot. And if we take a more full-stack, socio-emotional understanding of why this is happening, think about it from a parent's perspective. Why is it so compelling to watch a video like that? Because the idea that you would inject your child with something that would give them autism is so fear inducing. People also need to understand what a global problem this is.
When we press play on a YouTube video, we set in motion an algorithm that taps all available data to find the next video that keeps us glued to the screen. Because of its advertising-based business model, YouTube’s top priority is not to help us learn to play the accordion, tie a bow tie, heal an injury, or see a new city — it’s to keep us staring at the screen for as long as possible, regardless of the content. This episode’s guest, AI expert Guillaume Chaslot, helped write YouTube’s recommendation engine and explains how those priorities spin up outrage, conspiracy theories, and extremism. After leaving YouTube, Guillaume’s mission became shedding light on those hidden patterns on his website, AlgoTransparency.org, which tracks and publicizes YouTube recommendations for controversial content channels. Through his work, he encourages YouTube to take responsibility for the videos it promotes and aims to give viewers more control.