For Humanity: An AI Safety Podcast

Episode #34 - “The Threat of AI Autonomous Replication” For Humanity: An AI Risk Podcast

Jun 26, 2024
01:17:23

In Episode #34, host John Sherman talks with Charbel-Raphaël Segerie, Executive Director of the Centre pour la sécurité de l'IA (France's Center for AI Security). Among the very important topics covered: autonomous AI self-replication, the potential for warning shots to go unnoticed because the public and journalists are uneducated on AI risk, and the potential for a disastrous Yann LeCun-ification of the upcoming February 2025 Paris AI Safety Summit.

 

Please Donate Here To Help Promote For Humanity

https://www.paypal.com/paypalme/forhumanitypodcast

This podcast is not journalism. But it's not opinion either. It is a long-form public service announcement. This show simply strings together the existing facts and underscores the unthinkable probable outcome: the end of all life on Earth.

For Humanity: An AI Safety Podcast is the accessible AI safety podcast for all humans, no tech background required. Our show focuses solely on the threat of human extinction from AI.

Peabody Award-winning former journalist John Sherman explores the shocking worst-case scenario of artificial intelligence: human extinction. The makers of AI openly admit their work could kill all humans, possibly in as little as two years. This podcast is solely about the threat of human extinction from AGI. We'll meet the heroes and villains, explore the issues and ideas, and learn what you can do to help save humanity.

For Humanity Theme Music by Josef Ebner

Youtube: https://www.youtube.com/channel/UCveruX8E-Il5A9VMC-N4vlg

Website: https://josef.pictures

RESOURCES:

Charbel-Raphaël Segerie’s Less Wrong Writing, much more on many topics we covered!

https://www.lesswrong.com/users/charbel-raphael

BUY STEPHEN HANSON’S BEAUTIFUL AI RISK BOOK!!!

https://stephenhansonart.bigcartel.com/product/the-entity-i-couldn-t-fathom

JOIN THE FIGHT, help Pause AI!!!!

Pause AI

Join the Pause AI Weekly Discord Thursdays at 2pm EST

https://discord.com/invite/pVMWjddaW7

22 Word Statement from Center for AI Safety

Statement on AI Risk | CAIS

https://www.safe.ai/work/statement-on-ai-risk

Best Account on Twitter: AI Notkilleveryoneism Memes 

https://twitter.com/AISafetyMemes

TIMESTAMPS:

**The threat of AI autonomous replication (00:00:43)**

**Introduction to France's Center for AI Security (00:01:23)**

**Challenges in AI risk awareness in France (00:09:36)**

**The influence of Yann LeCun on AI risk perception in France (00:12:53)**

**Autonomous replication and adaptation of AI (00:15:25)**

**The potential impact of autonomous replication (00:27:24)**

**The dead internet scenario (00:27:38)**

**The potential existential threat (00:29:02)**

**Fast takeoff scenario (00:30:54)**

**Dangers of autonomous replication and adaptation (00:34:39)**

**Difficulty in recognizing warning shots (00:40:00)**

**Defining red lines for AI development (00:42:44)**

**Effective education strategies (00:46:36)**

**Impact on computer science students (00:51:27)**

**AI safety summit in Paris (00:53:53)**

**The summit and AI safety report (00:55:02)**

**Potential impact of key figures (00:56:24)**

**Political influence on AI risk (00:57:32)**

**Accelerationism in political context (01:00:37)**

**Optimism and hope for the future (01:04:25)**

**Chances of a meaningful pause (01:08:43)**

