Getting a Better Number Maybe Makes the World Better
There does seem to be something particularly unusual about the kinds of questions that EA research involves. It's very hard to get feedback loops on the thing you ultimately care about most, which is: did we all die, or did we get locked into some terrible future? Because if it happens once, it's too late. By contrast, there's lots of nuclear-risk-relevant stuff happening all the time, for example, and you can have forecasts on that kind of thing. So I'm not saying that a figure like 0.5 percent existential risk from nuclear weapons by 2100 is driving the project, but I also have no sense of what's driving it instead. Rather than having a sort of meme template, I think this would help people understand why they're doing it.