Episode 49: AGI Alignment and Safety

The Theory of Anything

The Importance of Analogies in Artificial Intelligence

The way we even try to think about artificial intelligence today is always going to be narrow. A real AGI has some means of being able to learn anything, and we're not doing anything superintelligent in that space yet. What I have is an abstract idea of what AI is trying to implement here, and that's all I really have for you right now.
