"I think we were all surprised how many incredible people did sign the letter. It's like saying, you know, just because something is hard doesn't mean it shouldn't be the intention. We need to get good at coordination. All of our world's problems are coordination problems. I would say unregulated deployment of AI would be the reason we lose to China. If worse actors beat you to dominance in deploying AI, with no morals, no safety considerations, and different values for the future of the world, then we won't want to lose to that."
There’s really no one better than veteran tech journalist Kara Swisher at challenging people to articulate their thinking. Tristan Harris recently sat down with her for a wide-ranging interview on AI risk. She even pressed Tristan on whether he is a doomsday prepper. It was so great, we wanted to share it with you here.
The interview originally aired on Kara’s podcast, ON with Kara Swisher. If you like it and want to hear more of Kara’s interviews with folks like Sam Altman, Reid Hoffman, and others, you can find more episodes of ON with Kara Swisher here: https://link.chtbl.com/_XTWwg3k
RECOMMENDED YUA EPISODES
AI Myths and Misconceptions
The AI Dilemma
The Three Rules of Humane Tech
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_