EA - Evaluations from Manifund's EA Community Choice initiative by Arepo

Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Evaluations from Manifund's EA Community Choice initiative, published by Arepo on September 17, 2024 on The Effective Altruism Forum.
My partner (who we'll refer to as 'they' for plausible anonymity) and I ('he') recently took part in Manifund's EA Community Choice initiative. Since the available money was fully claimed before they could claim anything, we decided to work together on distributing the $600 I received.
I think this was a great initiative, not only because it gave us a couple of fun date nights, but because it demonstrated a lot of latent wisdom of the crowd sitting largely untapped in the EA community. Many thanks to Anonymous Donor for both of these outcomes! This post is our effort to pay the kindness (further) forward.
As my partner went through the projects, we decided to keep notes on most of them and on the landscape overall, to hopefully contribute in our small way to the community's self-understanding. These notes were necessarily scrappy given the time available, and in some cases blunt, but we hope that even the recipients of criticism will find something useful in what we had to say.
In this post we've included notes only on the projects we funded, but you can see our comments on the full set of projects (including those we didn't fund) on this spreadsheet.
Our process:
We had three 'date nights', where both of us went through the list of grants independently. For each project, we indicated Yes, No, or Maybe, and then spent the second half of our time discussing our notes. Once we'd placed everything into a yes/no category, we each got a vote on whether it was a standout; if one of us marked it that way it would receive a greater amount, and if both of us did we'd give it $100.
In this way we had three tiers of support: 'double standout', 'single standout', and 'supported' (or four tiers, if you count the projects we didn't give money to).
In general we wanted to support a wide set of projects, partly because of the quadratic funding match (see the sketch below), but mostly because with $600 between us, the epistemic value of sending an extra signal of support seemed much more important than giving any one project an extra $10. Even so, there were a number of projects we would have liked to support but couldn't without eroding the quasi-meaningful amounts we wanted to give to our standout picks.
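For readers unfamiliar with why spreading donations interacts well with the match: under the textbook quadratic funding rule, a project's share of the matching pool grows with the square of the sum of the square roots of its individual contributions, so many small donations to projects that already have other backers tend to raise the total match more than concentrating the same money on one project. The sketch below is a toy illustration only; the formula, donor counts, and dollar figures are assumptions, not Manifund's actual matching rules.

```python
import math

# Toy illustration (assumed textbook quadratic-funding rule, not
# Manifund's actual matching formula): a project's match weight is
# (sum of square roots of its contributions)^2.

def match_weight(contributions):
    return sum(math.sqrt(c) for c in contributions) ** 2

# Suppose 60 projects each already have ten $25 donations from other donors.
existing = [25] * 10

# Option A: spread our $600 as $10 to each of the 60 projects.
spread_gain = sum(
    match_weight(existing + [10]) - match_weight(existing)
    for _ in range(60)
)

# Option B: give the whole $600 to a single such project.
concentrated_gain = match_weight(existing + [600]) - match_weight(existing)

print(round(spread_gain), round(concentrated_gain))  # ~19574 vs ~3049
```

The larger gain from Option A comes from the concavity of the square root: each marginal dollar counts for more when it arrives as a separate, smaller contribution alongside existing donors.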
He and they had some general thoughts provoked by this process:
His general observations
Despite being philosophically aligned with totalising consequentialism (and hence, in theory, longtermism), I found the animal welfare submissions substantially more convincing than the longtermist ones. Perhaps this is because I'm comparatively sceptical of AI as a unique x-risk (and almost all the longtermist submissions were AI-related), but they also seemed noticeably less well constructed, with less convincing track records from the teams behind them. I have a couple of hypotheses for this:
The nature of the work and the culture of longtermist EA attracting people with idealistic conviction but not much practical ability
The EA funding landscape being much kinder to longtermist work, such that the better longtermist projects tend to have a lot of funding already
Similarly, I'm strongly bought into the narrative of community-building work (which to me has been unfairly scapegoated for much of what went wrong with FTX), but there wasn't actually that much of it here. And as with the AI proposals, these didn't seem to have been thought through that well, or to be backed by a convincing track record (in this case that might be because it's very hard to build a track record in community building when there's so little funding for it - though see the next two points).
Even so, I would have liked to fund more of the community projects - many of them were among the last cuts.
'Track record' is really important to me, but doesn't have to mean 'impressive CV/el...
