
For Humanity: An AI Safety Podcast

Episode #25 TRAILER - “Does The AI Safety Movement Have It All Wrong?” Dr. Émile Torres Interview, For Humanity: An AI Safety Podcast

Apr 22, 2024
02:53

DONATE HERE TO HELP PROMOTE THIS SHOW




In episode #25 TRAILER, host John Sherman and Dr. Émile Torres explore the concept of humanity's future and the rise of artificial general intelligence (AGI) and machine superintelligence. Dr. Torres lays out their view that the AI safety movement has it all wrong on existential threat. They voice concerns about the potential risks of advanced AI, questioning the effectiveness of AI safety research and the true intentions of companies like OpenAI. Dr. Torres supports a full "stop AI" movement, doubting the benefits of pursuing such powerful AI technologies and highlighting the potential for catastrophic outcomes if AI systems become misaligned with human values. The discussion also touches on the urgency of solving the AI control problem to avoid human extinction.


Émile P. Torres is a philosopher whose research focuses on existential threats to civilization and humanity. They have published widely in the popular press and scholarly journals, with articles appearing in the Washington Post, Aeon, Bulletin of the Atomic Scientists, Metaphilosophy, Inquiry, Erkenntnis, and Futures.


This podcast is not journalism. But it’s not opinion either. This show simply strings together the existing facts and underscores the unthinkable yet probable outcome: the end of all life on Earth.


For Humanity: An AI Safety Podcast is the accessible AI safety podcast for all humans, no tech background required. Our show focuses solely on the threat of human extinction from AI.


Peabody Award-winning former journalist John Sherman explores the shocking worst-case scenario of artificial intelligence: human extinction. The makers of AI openly admit their work could kill all humans, possibly in as little as two years. This podcast is solely about the threat of human extinction from AGI. We’ll meet the heroes and villains, explore the issues and ideas, and learn what you can do to help save humanity.


TIMESTAMPS:

Defining Humanity and Future Descendants (00:00:00) Discussion on the concept of humanity, future descendants, and the implications of artificial general intelligence (AGI) and machine superintelligence.

Concerns about AI Safety Research (00:01:11) Expressing concerns about the approach of AI safety research and skepticism about the intentions of companies like OpenAI.

Questioning the Purpose of Building Advanced AI Systems (00:02:23) Skepticism about the purpose and potential benefits of building advanced AI systems, and sympathy for the "stop AI" movement.

RESOURCES:

Émile Torres’s Truthdig Articles:

https://www.truthdig.com/author/emile-p-torres/

Émile Torres’s Latest Book:

Human Extinction: A History of the Science and Ethics of Annihilation (Routledge Studies in the History of Science, Technology and Medicine), 1st Edition

https://www.amazon.com/Human-Extinction-Annihilation-Routledge-Technology/dp/1032159065


Best Account on Twitter: AI Notkilleveryoneism Memes 

JOIN THE FIGHT, help Pause AI!!!!

Pause AI

Join the Pause AI Weekly Discord Thursdays at 3pm EST

  / discord  

22 Word Statement from Center for AI Safety

Statement on AI Risk | CAIS
