
Doom Debates

AI Doom Debate: Tilek Mamutov vs. Liron Shapira

Jul 26, 2024
01:44:39

Tilek Mamutov is a Kyrgyzstani software engineer who worked at Google X for 11 years before founding Outtalent, an international recruiting company for software engineers.

Since first encountering the AI doom argument at a Center for Applied Rationality bootcamp 10 years ago, Tilek has considered it a serious possibility, but he isn’t currently convinced that doom is likely.

Let’s explore Tilek’s worldview and pinpoint where he gets off the doom train and why!

00:12 Tilek’s Background

01:43 Life in Kyrgyzstan

04:32 Tilek’s Non-Doomer Position

07:12 Debating AI Doom Scenarios

13:49 Nuclear Weapons and AI Analogies

39:22 Privacy and Empathy in Human-AI Interaction

39:43 AI's Potential in Understanding Human Emotions

41:14 The Debate on AI's Empathy Capabilities

42:23 Quantum Effects and AI's Predictive Models

45:33 The Complexity of AI Control and Safety

47:10 Optimization Power: AI vs. Human Intelligence

48:39 The Risks of AI Self-Replication and Control

51:52 Historical Analogies and AI Safety Concerns

56:35 The Challenge of Embedding Safety in AI Goals

01:02:42 The Future of AI: Control, Optimization, and Risks

01:15:54 The Fragility of Security Systems

01:16:56 Debating AI Optimization and Catastrophic Risks

01:18:34 The Outcome Pump Thought Experiment

01:19:46 Human Persuasion vs. AI Control

01:21:37 The Crux of Disagreement: Robustness of AI Goals

01:28:57 Slow vs. Fast AI Takeoff Scenarios

01:38:54 The Importance of AI Alignment

01:43:05 Conclusion

Follow Tilek

x.com/tilek

Links

I referenced Paul Christiano’s scenario of gradual AI doom, a slower version that doesn’t require a Yudkowskian “foom”. Worth a read: What Failure Looks Like

I also referenced the concept of “edge instantiation” to explain that when you optimize powerfully for some metric, you don’t get other intuitively nice things as a bonus; you *just* get the exact thing your function is measuring.



This is a public episode. If you would like to discuss this with other subscribers or get access to bonus episodes, visit lironshapira.substack.com
