If the technology confers power, it starts to race. There are a lot of subtle ways that technologies and design choices confer power. You know, we talk about this with social media, where the sites that dose you with likes and positive social feedback 100 times an hour are going to outcompete the sites that don't give you frequent social feedback. Watching it become a race between the companies to figure out how to keep people mindlessly scrolling — it took me a number of years to even let that in. It's a lot. Yeah.
In our previous episode, we shared a presentation Tristan and Aza recently delivered to a group of influential technologists about the race happening in AI. In that talk, they introduced the Three Rules of Humane Technology. In this Spotlight episode, we’re taking a moment to explore these three rules more deeply in order to clarify what it means to be a responsible technologist in the age of AI.
Correction: Aza mentions infinite scroll being in the pockets of 5 billion people, implying that there are 5 billion smartphone users worldwide. The actual number of smartphone users worldwide is now 6.8 billion.
RECOMMENDED MEDIA
We Think in 3D. Social Media Should, Too
Tristan Harris writes about a simple visual experiment that demonstrates the power of one’s point of view
Let’s Think About Slowing Down AI
Katja Grace’s piece about how to avert doom by not building the doom machine
If We Don’t Master AI, It Will Master Us
Yuval Harari, Tristan Harris and Aza Raskin call on world leaders to respond to this moment at the level of the challenge it presents in this New York Times opinion piece
RECOMMENDED YUA EPISODES
The AI Dilemma
Synthetic humanity: AI & What’s At Stake
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_