
Anders Sandberg | Post-Scarcity Civilizations & Cognitive Enhancement

Foresight Institute Radio

Is Alignment With AGI Possible?

David Wheeler says he has proof that there is no possibility of alignment in the general case, even for slow takeoff. The real deep question, I think, is: do you get something like the fast takeoff scenario, where alignment has to be really perfect or everything is lost? We don't know. But if we look at the AGIs humans can actually make, and the uses they are put to when they work in the environment of Earth's surface and so on, I think we should get good-enough alignment. And some visionaries say: don't worry about alignment with AGI, because we will become the AGI. Don't worry about robots taking over the world; we will be augmented and enhanced.
