In Practical Ethics, Peter Singer provides a comprehensive introduction to applied ethics, emphasizing the role of reason in moral decision-making. Singer discusses a range of ethical issues, including equality across race, sex, ability, and species, as well as abortion, euthanasia, infanticide, embryo experimentation, animal rights, political violence, overseas aid, and environmental concerns. He advocates a utilitarian approach, arguing that it offers a practical framework for resolving moral conflicts by maximizing well-being and reducing suffering. The book also explores ideas at the heart of effective altruism, urging readers to consider their moral obligations to those in extreme poverty and to maximize the impact of their charitable giving. The third edition adds a chapter on climate change, one of the most pressing ethical challenges of our time.
Derek Parfit’s Reasons and Persons is a landmark work of contemporary philosophy that has profoundly shaped discussions of personal identity, ethics, and rationality. Parfit challenges traditional notions of the self, arguing that our sense of personal identity is less coherent than we assume, and explores what this implies for our moral obligations, particularly to future generations. He also examines decision-making under uncertainty, asking how we should weigh our present interests against the potential consequences of our actions for the future. Parfit’s rigorous, thought-provoking arguments have had a lasting influence on ethics, political philosophy, and decision theory, and they continue to stimulate debate and inspire new research.
In The Precipice, Toby Ord argues that humanity has entered a uniquely dangerous period, which he terms ‘the Precipice,’ beginning with the first atomic bomb test in 1945. Ord examines existential risks both natural and anthropogenic, and estimates a one-in-six chance that humanity suffers an existential catastrophe within the next 100 years. He calls for a major reorientation in how we see the world and our role in it, and for collective action to minimize these risks and secure a safe future for humanity. The book integrates insights from physics, biology, earth science, computer science, history, anthropology, statistics, international relations, and moral philosophy.
How close are we to the end of humanity? Toby Ord, Senior Researcher at Oxford University’s AI Governance Initiative and author of The Precipice, puts the odds of an existential catastrophe this century at roughly one in six. In this wide-ranging conversation, we unpack the risks that could end humanity’s story and explore why protecting future generations may be our greatest moral duty.
We explore:
• Why existential risk matters and what we owe the 10,000-plus generations who came before us
• Why Toby believes we face a one-in-six chance of existential catastrophe this century
• The four key types of AI risk: alignment failures, gradual disempowerment, AI-fueled coups, and AI-enabled weapons of mass destruction
• Why racing dynamics between companies and nations amplify those risks, and how an AI treaty might help
• How short-term incentives in democracies blind us to century-scale dangers, and policy ideas to counter that short-termism
• The lessons COVID should have taught us (but didn’t)
• The hidden ways the nuclear threat has intensified as treaties lapse and geopolitical tensions rise
• Concrete steps each of us can take today to steer humanity away from the brink
—
Transcript: https://www.generalist.com/p/existential-risk-and-the-future-of-humanity-toby-ord
—
This episode is brought to you by Brex: The banking solution for startups.
—
Timestamps
(00:00) Intro
(02:20) An explanation of existential risk, and the study of it
(06:20) How Toby’s interest in global poverty sparked his founding of Giving What We Can
(11:18) Why Toby chose to study under Derek Parfit at Oxford
(14:40) Population ethics, and how Parfit’s philosophy looked ahead to future generations
(19:05) An introduction to existential risk
(22:40) Why we should care about the continued existence of humans
(28:53) How fatherhood deepened Toby’s gratitude to his parents and previous generations
(31:57) An explanation of how LLMs and agents work
(40:10) The four types of AI risks
(46:58) How humans justify bad choices: lessons from the Manhattan Project
(51:29) A breakdown of the “unilateralist’s curse” and a case for an AI treaty
(1:02:15) COVID’s impact on our understanding of pandemic risk
(1:08:51) The shortcomings of our democracies and ways to combat our short-term focus
(1:14:50) Final meditations
—
Follow Toby Ord
Website: https://www.tobyord.com/
LinkedIn: https://www.linkedin.com/in/tobyord
X: https://x.com/tobyordoxford
Giving What We Can: https://www.givingwhatwecan.org/
—
Resources and episode mentions
—Books—
• The Precipice: Existential Risk and the Future of Humanity: https://www.amazon.com/dp/0316484911
• Reasons and Persons: https://www.amazon.com/Reasons-Persons-Derek-Parfit/dp/019824908X
• Practical Ethics: https://www.amazon.com/Practical-Ethics-Peter-Singer/dp/052143971X
—People—
• Derek Parfit: https://en.wikipedia.org/wiki/Derek_Parfit
• Carl Sagan: https://en.wikipedia.org/wiki/Carl_Sagan
• Stuart Russell: https://en.wikipedia.org/wiki/Stuart_J._Russell
—Other resources—
• DeepMind: https://deepmind.google/
• OpenAI: https://openai.com/
• Manhattan Project: https://en.wikipedia.org/wiki/Manhattan_Project
• The Unilateralist’s Curse and the Case for a Principle of Conformity: https://nickbostrom.com/papers/unilateralist.pdf
• The Nuclear Non-Proliferation Treaty (NPT), 1968: https://history.state.gov/milestones/1961-1968/npt
• The Blitz: https://en.wikipedia.org/wiki/The_Blitz
• Operation Warp Speed: https://en.wikipedia.org/wiki/Operation_Warp_Speed
—
Production and marketing by penname.co. For inquiries about sponsoring the podcast, email jordan@penname.co.