12 - AI Existential Risk with Paul Christiano

AXRP - the AI X-risk Research Podcast

CHAPTER

Human Potential in Expectation

Human potential in expectation equates to a 20% or 10% hit. So I think if you told me we'd definitely mess up alignment maximally, then we are looking at a pretty big, close to 100% drop. I wouldn't go all the way to a hundred; it's not literally as bad, probably, as a barren earth, but it's pretty bad.

Supposing AI goes poorly, or there's some kind of existential risk posed by some kind of, I guess, really bad AI, what do you imagine that looking like?

Yes, I guess I think most often about alignment, although I do think there

