Eliezer Yudkowsky can warn humankind that If Anyone Builds It, Everyone Dies and get on the New York Times bestseller list, but he won’t get upvoted to the top of LessWrong.
According to the leaders of LessWrong, that’s intentional. The rationalist community thinks aggregating community support for important claims is “political fighting”.
Unfortunately, it’s unrealistic to expect some other community to rally strongly behind Eliezer Yudkowsky’s message while LessWrong “stays out of the fray” and purposely prevents mutual knowledge of support from being displayed.
Our refusal to aggregate the rationalist community’s beliefs into signals and actions is why we live in a world where rationalists with double-digit P(Doom)s join AI-race companies instead of AI-pause movements.
We let our community become a circular firing squad. What did we expect?
Timestamps
00:00:00 — Cold Open
00:00:32 — Introducing Holly Elmore, Exec. Director of PauseAI US
00:03:12 — “If Anyone Builds It, Everyone Dies”
00:10:07 — What’s Your P(Doom)™
00:12:55 — Liron’s Review of IABIED
00:15:29 — Encouraging Early Joiners to a Movement
00:26:30 — MIRI’s Communication Issues
00:33:52 — Government Officials’ Reviews of IABIED
00:40:33 — Emmett Shear’s Review of IABIED
00:42:47 — Michael Nielsen’s Review of IABIED
00:45:35 — New York Times Review of IABIED
00:49:56 — Will MacAskill’s Review of IABIED
01:11:49 — Clara Collier’s Review of IABIED
01:22:17 — Vox Article Review
01:28:08 — The Circular Firing Squad
01:37:02 — Why Our Kind Can’t Cooperate
01:49:56 — LessWrong’s Lukewarm Show of Support
02:02:06 — The “Missing Mood” of Support
02:16:13 — Liron’s “Statement of Support for IABIED”
02:18:49 — LessWrong Community’s Reactions to the Statement
02:29:47 — Liron & Holly’s Hopes for the Community
02:39:01 — Call to Action
SHOW NOTES
* PauseAI US — https://pauseai-us.org
* PauseAI US Upcoming Events — https://pauseai-us.org/events
* International PauseAI — https://pauseai.info
* Holly’s Twitter — https://x.com/ilex_ulmus
Referenced Essays & Posts
* Liron’s Eliezer Yudkowsky interview post on LessWrong — https://www.lesswrong.com/posts/kiNbFKcKoNQKdgTp8/interview-with-eliezer-yudkowsky-on-rationality-and
* Liron’s “Statement of Support for If Anyone Builds It, Everyone Dies” — https://www.lesswrong.com/posts/aPi4HYA9ZtHKo6h8N/statement-of-support-for-if-anyone-builds-it-everyone-dies
* “Why Our Kind Can’t Cooperate” by Eliezer Yudkowsky (2009) — https://www.lesswrong.com/posts/7FzD7pNm9X68Gp5ZC/why-our-kind-can-t-cooperate
* “Something to Protect” by Eliezer Yudkowsky — https://www.lesswrong.com/posts/SGR4GxFK7KmW7ckCB/something-to-protect
* Center for AI Safety Statement on AI Risk — https://safe.ai/work/statement-on-ai-risk
OTHER RESOURCES MENTIONED
* Steven Pinker’s new book on mutual knowledge, When Everyone Knows That Everyone Knows... — https://stevenpinker.com/publications/when-everyone-knows-everyone-knows-common-knowledge-and-mysteries-money-power-and
* Scott Alexander’s “Ethnic Tension and Meaningless Arguments” — https://slatestarcodex.com/2014/11/04/ethnic-tension-and-meaningless-arguments/
PREVIOUS EPISODES REFERENCED
* Holly’s previous Doom Debates appearance debating California’s SB 1047 bill — https://www.youtube.com/watch?v=xUP3GywD0yM
* Liron’s interview with Eliezer Yudkowsky about the IABIED launch — https://www.youtube.com/watch?v=wQtpSQmMNP0
---
Doom Debates’ mission is to raise mainstream awareness of imminent extinction from AGI and build the social infrastructure for high-quality debate.
Support the mission by subscribing to my Substack at DoomDebates.com and to youtube.com/@DoomDebates. Or, to really take things to the next level: donate 🙏
Get full access to Doom Debates at lironshapira.substack.com/subscribe