Tristan: I think what you're doing in France is so critical, and it's why we could replicate that in the US or the EU. Guillaume has this project called AlgoTransparency, which basically shows as much as it can: it scrapes YouTube and shows which videos are getting recommended the most.

Tristan: Every time our values are pitted against engagement, our values lose. The platforms are choosing what goes into the information soil from which all of our collective sense-making and decision-making abilities grow.
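As a rough illustration of that idea, the aggregation step behind a tool like AlgoTransparency can be very simple: collect the recommendation lists scraped from many watched videos, then count which videos appear most often. This is only a sketch under assumed inputs, not AlgoTransparency's actual code; the data shape and names here are invented for illustration.

```python
# Hypothetical sketch of the tallying step: given recommendation lists
# scraped from many videos, count which videos YouTube recommends most.
# The input format is assumed, not taken from AlgoTransparency itself.

from collections import Counter

# Each entry maps a watched video to the videos recommended alongside it
# (toy data, invented for illustration).
scraped_recommendations = {
    "video_A": ["video_X", "video_Y", "video_Z"],
    "video_B": ["video_X", "video_Z"],
    "video_C": ["video_X"],
}

counts = Counter(
    rec for recs in scraped_recommendations.values() for rec in recs
)

# The most frequently recommended videos across the crawl.
for video_id, n in counts.most_common(3):
    print(f"{video_id}: recommended from {n} watched videos")
```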
When we press play on a YouTube video, we set in motion an algorithm that taps all available data to find the next video that will keep us glued to the screen. Because of its advertising-based business model, YouTube's top priority is not to help us learn to play the accordion, tie a bow tie, heal an injury, or see a new city; it's to keep us staring at the screen for as long as possible, regardless of the content. This episode's guest, AI expert Guillaume Chaslot, helped write YouTube's recommendation engine and explains how those priorities amplify outrage, conspiracy theories, and extremism. After leaving YouTube, Guillaume made it his mission to shed light on those hidden patterns through his website, AlgoTransparency.org, which tracks and publicizes the videos YouTube recommends from controversial channels. Through his work, he encourages YouTube to take responsibility for the videos it promotes and aims to give viewers more control.
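To make that objective concrete, here is a minimal sketch of an engagement-optimized ranker. It is not YouTube's actual system; the types and the `predicted_watch_time` field are hypothetical, and the point is only that when the scoring function sees nothing but predicted watch time, the content of the video never enters the decision.

```python
# Minimal sketch of an engagement-only recommender. NOT YouTube's code;
# Video and predicted_watch_time are invented names for illustration.

from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_watch_time: float  # model's estimate, in minutes
    is_conspiratorial: bool      # attribute the ranker never looks at

def recommend_next(candidates: list[Video]) -> Video:
    # The objective is pure engagement: pick whatever the model predicts
    # will keep the viewer watching longest. Nothing about accuracy,
    # well-being, or extremism appears in the scoring function.
    return max(candidates, key=lambda v: v.predicted_watch_time)

candidates = [
    Video("Accordion basics, lesson 1",
          predicted_watch_time=4.0, is_conspiratorial=False),
    Video("What THEY don't want you to know",
          predicted_watch_time=11.5, is_conspiratorial=True),
]

print(recommend_next(candidates).title)
# -> "What THEY don't want you to know": if outrage predicts more
#    watch time, outrage is what gets recommended.
```

The design choice the sketch highlights is the one the episode criticizes: every value a viewer might care about is invisible to the objective unless it happens to correlate with watch time.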