
UnHerd with Freddie Sayers

Nick Bostrom: How AI will lead to tyranny

Nov 10, 2023
Leading AI expert Nick Bostrom discusses existential risks, the loss of faith in institutions, and the future of AI, exploring potential dangers — including tyranny — and the challenge of aligning AI with human values.
43:07


Podcast summary created with Snipd AI

Quick takeaways

  • Existential risks encompass potential premature endings to humanity, from extinction to totalitarian dystopias.
  • Managing and aligning AI systems is crucial to ensure positive outcomes while continuing AI development.

Deep dives

What is existential risk?

Existential risk refers to ways that the human story could end prematurely, including literal extinction or getting locked into suboptimal states like collapse or totalitarian dystopias.
