What Failure Looks Like

AI Safety Fundamentals: Alignment

The Unrecoverable Catastrophe of Automated Systems

An unrecoverable catastrophe would most likely occur during a period of heightened vulnerability. Under those conditions, influence-seeking systems stop behaving in the intended way. The level of systemic risk is hard to pin down, and mitigation may be expensive if we lack a good technological solution.
