Keen On America

Can We Get To 2125? Humanity's Most Existential Threats Over the Next 100 Years

Sep 9, 2025
Gary F. Bengier, a writer, philosopher, and technologist, dives into humanity's biggest threats for the next century: climate change, nuclear war, and the rise of robots. He emphasizes the importance of concentrating on these core issues without getting sidetracked by lesser worries. Bengier warns that these challenges are interconnected: a nuclear conflict, for example, could hinder efforts to combat climate change. Ultimately, his pragmatic approach suggests that if we focus on these three threats, we might navigate to 2125, albeit with uncertainties looming.
AI Snips
INSIGHT

Focus On A Few Existential Risks

  • Gary Bengier argues that accelerating technology means the next 100 years will bring even faster change than the last.
  • He urges focusing on a short list of existential risks rather than being distracted by lesser worries.
INSIGHT

Two Waves Of AI Disruption

  • Bengier predicts two waves of AI disruption: the current software wave is changing jobs, while a future wave of AI embedded in robots could destroy many of them.
  • He warns that ubiquitous robots building robot factories will create a structural economic shift.
INSIGHT

Skepticism About Machine Consciousness

  • Bengier doubts that current AI is anywhere near human-like consciousness, citing the 'hard problem' of consciousness and the need for embodiment.
  • He believes philosophical study shows that machines remain far from replicating qualia.