
Why Not Slow AI Progress?

Astral Codex Ten Podcast


Introduction

AI safety is the work of preventing AI from becoming dangerous. Some labs straddle the line between capabilities and safety research. Why isn't the rationalist/EA/AI safety movement doing this more? This is an audio version of Scott Alexander's Astral Codex Ten. If you like it, you can subscribe on Substack.
