The End Of The World with Josh Clark

End

Nov 30, 2018
Humanity faces unprecedented existential risks this century. The discussion dives into the dangers of nuclear weapons and the ethical dilemmas scientists faced during their development. Philosophers weigh in on the moral urgency of addressing these threats and on how public awareness can drive action. The episode also emphasizes the role of modern technology in amplifying risks, and proposes space colonization as a long-term survival strategy, urging informed public participation in scientific decision-making.
INSIGHT

Humanity Faces An Unprecedented Window Of Risk

  • We live in a uniquely dangerous era in which our technologies could cause human extinction within the next century or two.
  • If we fail, intelligent life may end with us, which makes the coming generations the most important ever to act.
ANECDOTE

Trinity Test Showed How Small Decisions Risk All Life

  • The Trinity test in 1945 opened the atomic age, and some scientists feared the blast might ignite Earth's atmosphere.
  • Arthur Compton treated even that tiny risk as serious, reportedly setting a risk threshold before allowing the test to proceed.
INSIGHT

Tiny Probabilities Are Catastrophic At Scale

  • Experts often round tiny probabilities down to zero for convenience, but when the stakes are existential, even minuscule chances are catastrophic.
  • Deciding what level of existential risk is acceptable is a subjective judgment, and it is dangerous to leave it to a few people.