
12 - AI Existential Risk with Paul Christiano

AXRP - the AI X-risk Research Podcast


Human Potential in Expectation

Human potential in expectation equates to a 20% or 10% hit. So I think if you told me we'd definitely mess up alignment maximally, then we're looking at a pretty big, close to 100% drop. I wouldn't go all the way to 100; it's not literally as bad, probably, as a barren Earth, but it's pretty bad.

Supposing AI goes poorly, or there's some kind of existential risk posed by some kind of, I guess, really bad AI, what do you imagine that looking like?

Yes, I guess I think most often about alignment, although I do think there…

