
Toby Ord
Moral philosopher at Oxford University. Author of "The Precipice: Existential Risk and the Future of Humanity".
Top 10 podcasts with Toby Ord
Ranked by the Snipd community

38 snips
Nov 7, 2018 • 37min
Fermi Paradox
Ever wondered where all the aliens are? It’s actually very weird that, as big and old as the universe is, we seem to be the only intelligent life. In this episode, Josh examines the Fermi paradox, and what it says about humanity’s place in the universe. (Original score by Point Lobo.) Interviewees: Anders Sandberg, Oxford University philosopher and co-creator of the Aestivation hypothesis; Seth Shostak, director of SETI; Toby Ord, Oxford University philosopher.

25 snips
Oct 3, 2021 • 3h 14min
One: Toby Ord on existential risks
In 2020, Oxford academic and 80,000 Hours trustee Dr Toby Ord released his book The Precipice: Existential Risk and the Future of Humanity. It's about how our long-term future could be better than almost anyone believes, but also how humanity's recklessness is putting that future at grave risk — in Toby's reckoning, a 1 in 6 chance of being extinguished this century. Toby is a famously good explainer of complex issues — a bit of a modern Carl Sagan character — so we thought this would be a perfect introduction to the problem of existential risks. This episode first broadcast on the regular 80,000 Hours Podcast feed on March 7, 2020. Some related episodes include:
• #81 – Ben Garfinkel on scrutinising classic AI risk arguments
• #70 – Dr Cassidy Nelson on the twelve best ways to stop the next pandemic (and limit COVID-19)
• #43 – Daniel Ellsberg on the creation of nuclear doomsday machines, the institutional insanity that maintains them, and how they could be dismantled
Series produced by Keiran Harris.

16 snips
Dec 10, 2021 • 1h 9min
Humanity on the precipice (Toby Ord)
Humanity could thrive for millions of years — unless our future is cut short by an existential catastrophe. Oxford philosopher Toby Ord explains the possible existential risks we face, including climate change, pandemics, and artificial intelligence. Toby and Julia discuss what led him to take existential risk more seriously, which risks he considers underrated vs. overrated, and how to estimate the probability of existential risk.

15 snips
Sep 8, 2023 • 3h 7min
#163 – Toby Ord on the perils of maximising the good that you do
Toby Ord, a moral philosopher from the University of Oxford and a pioneer of effective altruism, discusses the complexities of maximizing good in altruistic efforts. He warns against the dangers of an all-or-nothing approach, using the FTX fallout as a cautionary tale. Toby emphasizes the importance of integrity and humility in leadership and argues for a more balanced goal: 'doing most of the good you can.' He also explores the intricate relationship between utilitarian ethics and individual character, highlighting the nuanced nature of moral decision-making.

9 snips
Mar 7, 2020 • 3h 14min
#72 - Toby Ord on the precipice and humanity's potential futures
Toby Ord, a moral philosopher at Oxford and author of 'The Precipice,' discusses humanity's precarious future. He reveals a staggering 1 in 6 chance of extinction this century due to both natural and human-made risks. Toby highlights the threat of supervolcanoes over asteroids, the alarming underfunding of global safety agreements, and the existential risks posed by AI. He emphasizes the importance of proactive measures, long-term planning, and moral dialogue to ensure a thriving future for humanity.

7 snips
Oct 25, 2023 • 27min
Highlights: #163 – Toby Ord on the perils of maximising the good that you do
Toby Ord discusses the trade-offs of maximizing one metric and the risks of optimizing AI. We explore the concept of moral trade and the impact of virtue as a multiplier on projects. Also, learn about a collection of restored and unseen Earth photographs from the Apollo program.

7 snips
Nov 14, 2018 • 37min
Natural Risks
Guests on the podcast talk about the existential risks faced by humanity, including asteroid impacts, the potential for human colonization of other planets, the runaway greenhouse effect caused by climate change, and the dangers of intelligent algorithms evolving beyond human control.

Jul 15, 2021 • 1h 6min
How can we save the world? (with Toby Ord)
Toby Ord, a Senior Research Fellow at Oxford, dives into existential risks that threaten humanity's future, discussing topics from climate change to technological dangers like AI and engineered pandemics. He explains the distinction between catastrophic and existential risks, emphasizing that supervolcanoes may pose a more immediate threat than asteroids. The conversation also explores long-termism and our moral obligations to future generations, underscoring the need for global cooperation and innovative solutions to ensure a safe and secure world.

Apr 26, 2021 • 1h 12min
Existential Risks & the Future of Humanity | Toby Ord
Toby Ord, Senior Research Fellow in Philosophy at Oxford University, discusses existential risks and the future of humanity. Topics include nuclear war, geopolitical dimensions, pandemics, biological terrorism, artificial intelligence, asteroid impacts, climate change, and supervolcanic eruptions.

Jun 23, 2020 • 1h 5min
#208 — Existential Risk
Toby Ord, a philosopher at Oxford University focused on existential risks and effective altruism, dives deep into the future of humanity. He explores the moral biases shaping our views on distance and time, highlighting the psychology behind effective altruism. The conversation navigates the complexities of distinguishing natural threats from human-made ones, addressing dangers like nuclear war and pandemics. Ord reflects on the morality of altruism, balancing emotion with logic, and the essential responsibility we hold for future generations.