
Angry Planet

The Cult of Rationalism in Silicon Valley

Mar 25, 2025
Max Read, journalist and author of the Read Max Substack, unpacks the cult-like nature of rationalism in Silicon Valley. He explains how rationalism, part intellectual movement and part self-help program, shapes tech ideologies and influences decisions about artificial intelligence. Read discusses the dangers attributed to superintelligent systems and critiques the oversimplified thinking prevalent in these communities. He also examines the moral ambiguities that arise in effective altruism and questions the belief that technology can solve complex human problems.
01:01:34

Quick takeaways

  • The rationalism movement in Silicon Valley promotes enhanced reasoning skills but can lead to dangerously extreme beliefs and actions.
  • Eliezer Yudkowsky significantly influences the discourse on AI, advocating for alignment between human values and superintelligent AI to avert disaster.

Deep dives

Understanding Rationalism

Rationalism is presented as a movement centered on the idea that individuals can improve their reasoning skills to solve a wide range of problems. The community holds that by working to eliminate cognitive biases, people can approach issues more logically. While some aspects of rationalism resemble cognitive behavioral therapy, fringe beliefs within the movement can veer dangerously close to justifying extreme actions. A key figure is Eliezer Yudkowsky, who argues that a future superintelligent AI must be aligned with human values to avoid potential disaster.
