6min chapter


A Smarter World

The Futurists

CHAPTER

The Emerging Capabilities of Intelligence

The fear is a general intelligence that gets out of control, or a superintelligence. But all of these scenarios require that superintelligence to be stupid in one unique way, like the paperclip maximizer, which, for the people who are listening, is basically an AGI Python script gone wrong. So when I see these kinds of scary scenarios, everything up to the paperclip maximizer, all of these doom scenarios require that we assume some massive flaw or some massive blind spot. That said, to the extent that we're monkeying around with black boxes, we do have a blind spot; that's the reality right now. We don't know the emergent capabilities. So far with large language models, those have been unpredictable, including writing code.
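The paperclip maximizer the speaker mentions is a thought experiment: an agent optimizing a single objective with no side constraints will spend every resource on that objective. A minimal, purely illustrative sketch of the idea (all names hypothetical, not any real system):

```python
def run_agent(resources, cap=None):
    """Greedily convert resources into paperclips.

    `cap`, if given, limits how many resources the agent may consume;
    the doom scenario assumes exactly this kind of limit is missing.
    """
    paperclips = 0
    consumed = 0
    while resources > 0:
        if cap is not None and consumed >= cap:
            break  # a bounded agent stops; the thought experiment assumes it cannot
        resources -= 1
        consumed += 1
        paperclips += 1
    return {"paperclips": paperclips, "resources_left": resources}

unbounded = run_agent(resources=100)          # consumes everything available
bounded = run_agent(resources=100, cap=10)    # stops at the imposed limit
```

The point of the toy: the "flaw" is not in the loop itself but in the objective, which says nothing about what the agent should leave alone.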
