A Smarter World

The Futurists

CHAPTER

The Emerging Capabilities of Intelligence

The fear is a general intelligence that gets out of control, or a superintelligence. These scenarios all require that superintelligence to be stupid in one unique way, like the paperclip maximizer, which, for the people listening, is basically an AGI Python script gone wrong. All of these doom scenarios require us to assume some massive flaw or some massive blind spot. To the extent that we're monkeying around with black boxes, that blind spot is a reality right now. We couldn't predict emergent capabilities: so far with large language models, those have been unpredictable, including writing code.
