
Contra The xAI Alignment Plan

Astral Codex Ten Podcast


Elon Musk's Maximally Curious AI

One of the leading alignment plans is to wait until we have slightly-smarter-than-us AI, then ask it to solve alignment. This works best if the slightly-smarter-than-us AI is just following orders. I'm not sure how Musk's maximally curious AI helps with that. There's going to be more of a disconnect between current, easily tested applications and the eventual superintelligence that we need to get right.
