I think we should define when we use the word tragedy, because it means something very specific. No individual company wants to create a world where everyone is doom scrolling all day long; each just wants a little bit more of your attention for itself. But collectively that creates a tragedy of the commons, in which we now live: a collective tragedy of mindlessness. All right. So I hope that makes sense to the listeners, for how to use the three rules and why I wish I had known about them when I was working on infinite scroll. And now maybe it makes sense to do a little bit of a deeper dive through each of the three rules and expand on them. Rule one is always the most…
In our previous episode, we shared a presentation Tristan and Aza recently delivered to a group of influential technologists about the race happening in AI. In that talk, they introduced the Three Rules of Humane Technology. In this Spotlight episode, we’re taking a moment to explore these three rules more deeply in order to clarify what it means to be a responsible technologist in the age of AI.
Correction: Aza mentions infinite scroll being in the pockets of 5 billion people, implying that there are 5 billion smartphone users worldwide. The number of smartphone users worldwide is now actually 6.8 billion.
RECOMMENDED MEDIA
We Think in 3D. Social Media Should, Too
Tristan Harris writes about a simple visual experiment that demonstrates the power of one’s point of view.
Let’s Think About Slowing Down AI
Katja Grace’s piece about how to avert doom by not building the doom machine.
If We Don’t Master AI, It Will Master Us
In this New York Times opinion piece, Yuval Harari, Tristan Harris, and Aza Raskin call upon world leaders to respond to this moment at the level of the challenge it presents.
RECOMMENDED YUA EPISODES
The AI Dilemma
Synthetic humanity: AI & What’s At Stake
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_