6min chapter


177 - AI is a Ticking Time Bomb with Connor Leahy

Bankless

CHAPTER

The AI Alignment Problem

The AI alignment problem, or existential risk, is really what I care about. This is the idea that something could be so dangerous that an accident or a misuse of sufficient magnitude could threaten the continued existence of all humanity. As our technology gets better, you know, we develop more powerful weapons, we develop gunpowder, we develop stuff like this. Both the number of ways damage can be caused, on purpose or accidentally, and the scale of that damage increase. And at some point, if we have powerful enough technology, such as AGI, we should expect that things could go even worse.

