Contra The xAI Alignment Plan

Astral Codex Ten Podcast

The Waluigi Effect

It's very hard to get an AI to demonstrate a Waluigi Effect in real life. I don't see how switching to a maximally curious AI would prevent this problem. Some AI companies are trying to give their AIs exactly our current values. In my dreams, AI would become some kind of superintelligent moral reasoner.
