
Why I Am Not (As Much Of) A Doomer (As Some People)

Astral Codex Ten Podcast


The Importance of Alignment Research

Some people worry that, since training costs are so much higher than inference costs, once an AI is trained we can afford to run hundreds of millions of copies. That starts to sound more concerning and harder to bargain with. I'm optimistic because I think you can get AIs that do good alignment research before you get AIs that can do creepy acausal bargaining. How much harder is it to solve the alignment problem than to check someone else's solution? It took Newton to invent calculus, but any high schooler is able to use it, and anyone who uses calculus can confirm that it correctly solves calculus problems.
