
Your Book Review: God Emperor Of Dune

Astral Codex Ten Podcast


The Secondary Risk of AI

By creating an AI that coddles us, we run the risk of never advancing as a race again. As Dune points out, soft environments tend to weaken rather than reinforce. How much better would it be if we could create an AI that restricts all other AIs, but only as a secondary goal? And amplifying those traits in a positive direction until we, as a still identifiably human race, can stand against AIs on equal footing. I'm not sure if any of this is possible, even without the theoretical framework of AI risk. But I do know people at least a bit, and I know that the reflex to run from danger is often
