A lot of people, Guillaume, when they hear your story or mine, probably think, oh, it's these greedy companies and they just want their money. From the inside, I tried to be very positive, mostly because we have this image of the French always complaining about... And so I didn't want to be the typical Frenchman who complains about things. If you see a problem, don't complain about it to management; just fix it yourself. So that's what I did. I saw this problem, and with some people I proposed and implemented a solution. But nothing ever got adopted. It wasn't, in my case, because someone said, that's going to drop revenue
When we press play on a YouTube video, we set in motion an algorithm that taps all available data to find the next video that keeps us glued to the screen. Because of its advertising-based business model, YouTube’s top priority is not to help us learn to play the accordion, tie a bow tie, heal an injury, or see a new city — it’s to keep us staring at the screen for as long as possible, regardless of the content. This episode’s guest, AI expert Guillaume Chaslot, helped write YouTube’s recommendation engine and explains how those priorities spin up outrage, conspiracy theories, and extremism. After leaving YouTube, Guillaume made it his mission to shed light on those hidden patterns through his website, AlgoTransparency.org, which tracks and publicizes YouTube’s recommendations of controversial content. Through his work, he encourages YouTube to take responsibility for the videos it promotes and aims to give viewers more control.