Effective Altruism and Measurement
I want to hurt as few people as possible, and that includes myself. One of my intrinsic values is not causing suffering, even as a means to create a better situation later. I think there are many things that seem to be a benefit that either I haven't measured or I'm not even sure how you could measure. There's no measurement of kind and loving thoughts, feelings, or actions in the world. And that's really what I'm trying to optimize for; in a way this is connected to EA, but I don't think those things are measurable. I'm trusting my own subjective assessment of them instead. So how do you think about prioritization?