

Mentioned in 19 episodes
If Anyone Builds It, Everyone Dies
Book • 2025
This book delves into the potential risks of advanced artificial intelligence, arguing that the development of superintelligence could lead to catastrophic consequences for humanity.
The authors present a compelling case for careful scrutiny and regulation of AI development.
They explore various scenarios and potential outcomes, emphasizing the urgency of addressing the challenges posed by rapidly advancing AI capabilities.
The book is written in an accessible style, making complex ideas understandable to a broad audience.
It serves as a call to action, urging policymakers and researchers to prioritize AI safety and prevent potential existential threats.
Mentioned by

Mentioned by Kevin Roose as a mass market version of Eliezer's argument against building superhuman AI systems.
864 snips
Are We Past Peak iPhone? + Eliezer Yudkowsky on A.I. Doom

Mentioned by Sam Harris as the upcoming book by Eliezer Yudkowsky and Nate Soares on the dangers of superhuman AI.
203 snips
#434 — Can We Survive AI?

Mentioned by Josh Clark as a book by Eliezer Yudkowsky and Nate Soares that is a call to arms to spur humanity into action.
57 snips
How Dolphins Work!

Mentioned by Josh Clark as a new book about AI.
48 snips
Who are the Zizians?

Mentioned as the new book by Eliezer Yudkowsky and Nate Soares, exploring the dangers of superhuman AI.
36 snips
Book Review: If Anyone Builds It, Everyone Dies

Mentioned by Liron Shapira as a critical message on the dangers of AI, urging a shift in conversation towards the risk of building something that could kill everyone.
36 snips
This $85M-Backed Founder Claims Open Source AGI is Safe — Debate with Himanshu Tyagi

Recommended by Liron Shapira as capturing the essence of AI existential risk.
31 snips
Tech CTO Has 99.999% P(Doom) — “This is my bugout house” — Louis Berman, AI X-Risk Activist

Mentioned by Liron Shapira as the book Eliezer Yudkowsky co-authored with Nate Soares about the risk of building AGI.
30 snips
Rob Miles, Top AI Safety Educator: Humanity Isn’t Ready for Superintelligence!

Released by Eliezer Yudkowsky and Nate Soares, its message is fully condensed in the title.
28 snips
#434 - Can We Survive AI?

Mentioned as part of an upcoming book launch and discussion featuring Eliezer Yudkowsky.
26 snips
How AI Kills Everyone on the Planet in 10 Years — Liron on The Jona Ragogna Podcast

Mentioned by Nate Soares as the reason he and Eliezer Yudkowsky wanted to bring the AI conversation into the mainstream.
15 snips
Why Building Superintelligence Means Human Extinction (with Nate Soares)

Referenced by Liron Shapira to highlight the organization's perspective on AI risk.
11 snips
Q&A: Ilya's AGI Doomsday Bunker, Veo 3 is Westworld, Eliezer Yudkowsky, and much more!

Mentioned when discussing Eliezer Yudkowsky's views on AI and a review of the book in New Scientist.
Apple Unveils iPhone 17 Lineup, OpenAI & Oracle's $300B Cloud Pact | Nico Rosberg, Josh Machiz, Nir Zicherman, Amjad Masad, Alex Mashrabov, Stefan Cohen, Rohan Kodialam, Timothy Luchini, Marc Boroditsky, Michael Tannenbaum

Mentioned as receiving strong endorsements from scientists and academics.
“New Endorsements for ‘If Anyone Builds It, Everyone Dies’” by Malo

Mentioned by Max Tegmark as a book that plainly points out that there is no plan for controlling superintelligence.
Max Tegmark Says It's Time To Protest Against AI Companies

Mentioned by Liron Shapira as the new book by Eliezer Yudkowsky and Nate Soares, set to launch soon.
Get ready for LAUNCH WEEK!!! “If Anyone Builds It, Everyone Dies” by Eliezer Yudkowsky & Nate Soares

Mentioned by Nate Soares as a book with endorsements from the national security community.
The AI disconnect: understanding vs motivation, with Nate Soares

Mentioned by Liron Shapira as the new book by Eliezer Yudkowsky being launched the day after the interview.
Eliezer Yudkowsky — If Anyone Builds It, Everyone Dies

Mentioned as a forthcoming book co-authored by Eliezer Yudkowsky and Nate Soares about AI existential risk.
“The Problem” by Rob Bensinger, tanagrabeast, yams, So8res, Eliezer Yudkowsky, Gretta Duleba

Mentioned by Danny Fortson as a new book by one of the most prominent AI doomers.
OpenAI's iPhone moment & can AI teach?