

#499
Mentioned in 42 episodes
If Anyone Builds It, Everyone Dies
Book • 2025
This book delves into the potential risks of advanced artificial intelligence, arguing that the development of superintelligence could lead to catastrophic consequences for humanity.
The authors present a compelling case for the need for careful consideration and regulation of AI development.
They explore various scenarios and potential outcomes, emphasizing the urgency of addressing the challenges posed by rapidly advancing AI capabilities.
The book is written in an accessible style, making complex ideas understandable to a broad audience.
It serves as a call to action, urging policymakers and researchers to prioritize AI safety and prevent potential existential threats.
Mentioned by
Mentioned by Sam Harris as the upcoming book by Eliezer Yudkowsky and Nate Soares on the dangers of superhuman AI.
348 snips
#434 — Can We Survive AI?
Recommended by Max Tegmark as the most important book of the decade, calling out the lack of safety plans in AI development.

69 snips
“If Anyone Builds It, Everyone Dies” Party — Max Tegmark, Liv Boeree, Emmett Shear, Gary Marcus, Rob Miles & more!
Mentioned by Josh Clark as a book by Eliezer Yudkowsky and Nate Soares that is a call to arms to get humanity into action.

57 snips
How Dolphins Work!
Mentioned by Josh Clark as a new book about AI.

48 snips
Who are the Zizians?
Mentioned by Troy Young, who heard several podcasts about this book and its arguments about AI risks.

48 snips
Age of Extremes
Released by Eliezer Yudkowsky and Nate Soares; its message is fully condensed in that title.

46 snips
#434 - Can We Survive AI?
Mentioned as the new book by Eliezer Yudkowsky and Nate Soares, exploring the dangers of superhuman AI.

44 snips
Book Review: If Anyone Builds It, Everyone Dies
Mentioned as the focus of a "circular firing squad" within the rationalist community.

42 snips
Are We A Circular Firing Squad? — with Holly Elmore, Executive Director of PauseAI US
Mentioned by Paul Kingsnorth as a book coming out, possibly next week or this month.

38 snips
Paul Kingsnorth: How to fight the Machine
Mentioned by Liron Shapira as a critical message on the dangers of AI, urging a shift in conversation towards the risk of building something that could kill everyone.

36 snips
This $85M-Backed Founder Claims Open Source AGI is Safe — Debate with Himanshu Tyagi
Mentioned as a book written by Nate Soares and another MIRI colleague that begs humanity to slam on the brakes regarding AI development.

32 snips
They warned about AI before it was cool. They're still worried
Recommended by Henrik Moltke, who describes it as well-written and funny, but also as an acid trip through AI's darkest dreams.

31 snips
AI slår os alle ihjel, Musks MUS-samtale, ChatGPT-alderskontrol og Zuckerberg gør det live
Recommended by Liron Shapira as capturing the essence of AI existential risk.

31 snips
Tech CTO Has 99.999% P(Doom) — “This is my bugout house” — Louis Berman, AI X-Risk Activist
Mentioned by Liron Shapira as co-author of a book with Nate Soares about the risk of building AGI.

30 snips
Rob Miles, Top AI Safety Educator: Humanity Isn’t Ready for Superintelligence!
Mentioned by Nate Soares as the reason why he and Eliezer Yudkowsky wanted to have the AI conversation in the mainstream.

28 snips
Why Building Superintelligence Means Human Extinction (with Nate Soares)
Mentioned as part of an upcoming book launch and discussion featuring Eliezer Yudkowsky.

26 snips
How AI Kills Everyone on the Planet in 10 Years — Liron on The Jona Ragogna Podcast
Mentioned by Douglas Rushkoff as a new book co-authored by Nate Soares and Eliezer Yudkowsky about the dangers of superhuman AI.

24 snips
Will AI Kill Us for the Lulz? Nate Soares: If Anyone Builds It, Everyone Dies
Mentioned by Krystal Ball as a deeply terrifying book about AI that she finished listening to.

23 snips
9/25/25: ICE Shooting Details, Trump Pushes Comey Indictment, Italy Deploys Ships To Aid Gaza Flotilla
Mentioned by Jim Rutt as a New York Times bestseller, discussing AI alignment.

23 snips
EP 325 Joe Edelman on Full-Stack AI Alignment
Launched last week and hit number seven in nonfiction, according to Liron Shapira; also recommended by Wes Roth.

20 snips
Wes & Dylan Join Doom Debates — Violent Robots, Eliezer Yudkowsky, & Who Has the HIGHEST P(Doom)?!