A Smarter World

The Futurists

The Emerging Capabilities of Intelligence

The fear is a general intelligence that gets out of control, or a superintelligence. These scenarios all require that superintelligence to be stupid in one unique way, like the paperclip maximizer, which, for the people who are listening, is basically an AGI Python script gone wrong. So when I see these kinds of scary things, from the paperclip maximizer on down, all of these doom scenarios require that we assume some massive flaw or some massive blind spot. And to the extent that we're monkeying around with black boxes, that blind spot is a reality right now. We can't predict emergent capabilities: so far with large language models, those have been unpredictable, including writing code.

