Half of all AI researchers give AI at least a 10% chance of causing human extinction. Yet many oppose efforts to prevent the arrival of superintelligence, arguing that it could bring great value if it doesn't destroy us. A key reason we hear so little about superintelligence risk, as opposed to jobs, bias, and other harms, is a reluctance to talk about it. In the asteroid metaphor of "Don't Look Up," these arguments sound like "we deserve to get hit by an asteroid" and "let's make sure the asteroid hits the US first."
A reading of Max Tegmark's essay "The 'Don't Look Up' Thinking That Could Doom Us With AI"
The AI Breakdown helps you understand the most important news and discussions in AI.
Subscribe to The AI Breakdown newsletter: https://theaibreakdown.beehiiv.com/subscribe
Subscribe to The AI Breakdown on YouTube: https://www.youtube.com/@TheAIBreakdown
Join the community: bit.ly/aibreakdown
Learn more: http://breakdown.network/