Neglected Goals for Local EA Groups
Summary: EA organizations primarily track and encourage two outputs from local and university EA groups, and from the community more broadly: direct work in their field, and the donations they receive. Yet there are many other valuable goals for local and university EA groups to pursue that aren't measured or encouraged as much as they could or should be. Below I go through why several other goals for local EA groups are important, and why and how they could be better measured or encouraged through local EA groups and the EA community:
The quality and quantity of EA community members earning to give or pursuing other EA-inspired careers.
Commitments and fundraisers for effective charitable giving through a variety of membership organizations such as Giving What We Can and The Life You Can Save.
The quality and quantity of projects started by local EA groups.
The contribution of local EA groups to network effects in the EA community.
Introduction
In EA, in addition to donations, the output from university or local EA groups that's most commonly discussed is the talent driven to EA organizations. This makes sense: if meta-charities are often expanding, and the EA community serves as a steady stream of potential hires, that will drive the conversation. Unfortunately, it can also cause other very valuable outputs from local EA groups to be overlooked.
To the credit of the Centre for Effective Altruism (CEA) and a lot of regional EA organizations around the world, much more and better theory on EA movement-building has come out in the last couple of years. It just doesn't appear most of it has been fully implemented. I think what's missing is more evaluation and measurement of the most valuable products of local EA groups. Below are several things local EA groups can do that I expect a great number of effective altruists would consider just as valuable as, if not more valuable than, the talent driven to EA organizations, along with what makes them valuable and suggestions for how to pursue them.
1. The number of earners-to-give: the number of effective altruists from a local EA group who are earning to give. I know a lot of effective altruists have followed CEA and 80,000 Hours (80k) in deprioritizing earning to give as an EA career choice over the last couple of years. However, most effective altruists appear to have taken this change for granted; there was never much public conversation about this major shift in EA. I've seen more people bringing up the issue lately. In a conversation I had with Peter Hurford a few months ago, he said he wouldn't be surprised if earning to give is a top career choice for 50% of effective altruists.
It's commonly assumed earning to give was deprioritized because a lot of EA organizations started prioritizing the long-term future, and existential risk reduction has much less annual room for more funding than, e.g., global poverty alleviation. Ergo, there is less of a need for earning to give.
Yet I know organizations like the Machine Intelligence Research Institute (MIRI) have been consistently trying to grow throughout their whole lifetime. In the past, organizations like MIRI receiving more of their funding from independent donors has acted as social proof, giving big funders like the Open Philanthropy Project (Open Phil) confidence to make bigger grants to MIRI. This hasn't changed. Since Open Phil caps its grants to EA organizations at 50% of their annual budgets, more independent donors donating more to EA organizations increases the absolute size of their budgets, and thus the absolute size of the grants they can receive from funders like Open Phil as well.
Effective animal advocacy organizations also have limited sources of funding outside EA, and have huge funding needs, since factory farming is so daunting an issue. Earning to give perhaps remains the best option for effective altruists who prioritize farm animal welfare. Overall, the story that earning to give should be a much lower priority because fewer effective altruists prioritize global poverty alleviation, where charities like the Against Malaria Foundation can absorb more money than effective altruists can throw at them, doesn't add up.
There are limits to how fast EA organizations can grow. Yet the fact is they could be competently growing faster if they knew a reliable pool of funding from effective altruists was constantly growing as well. Many effective altruists who've been given the impression that direct work is an optimal career choice have noticed there is a glut of talent available for the kinds of EA organizations they want to work for. Yet the talent bottleneck EA organizations face is often due to a lack of coordination within the relevant fields. It appears EA organizations don't entirely know what they need, other than that they're seeking more specialized or different talent than the EA community as a talent pool is currently providing. This leaves a lot of generally very talented effective altruists who might start their own projects in need of funding. Overall, it appears there are still abundant opportunities for donations that could be filled by more effective altruists pursuing earning to give as a career.
2. GWWC/TLYCS/etc. memberships. Even if an effective altruist isn't earning to give, donating 10% of one's annual income is within reach of enough people that it can scale really well, especially if local EA groups can generate positive feedback loops for membership growth. Robert Wiblin recently wrote about how Giving What We Can membership growth has remained steady even though CEA has invested less effort into membership growth than in past years. On top of this, local EA groups figuring out how to drive membership growth could increase GWWC's growth rate even further for little to no overhead.
In addition to the GWWC Pledge, local EA groups could try pledge drives for GWWC's Try Giving option; pledge drives for or fundraisers with The Life You Can Save; or a pledge drive for Bolder Giving. In the last few years, a bunch of apps, group commitments, and other activities effective altruists can do together in camaraderie to do more good have been created. Local EA groups can be a great way to spread all of these, but there is much less engagement in that regard in the EA community than there could be. This also ties into how earning to give has been deprioritized: one of EA's pillars has been a culture of personal giving, which isn't promoted as much as it used to be.
3. Quantity and Quality of Projects Generated: More local EA groups are producing more projects these days. The quantity and quality of all those different projects, and how valuable and successful they are, are all things to be assessed and measured. 'Project' gets defined in a variety of ways, covering anything from something a handful of volunteers do in their spare time to a fledgling professional organization. Yet whatever effective altruists might optimize for, there is a real need to actually assess the project output of local EA groups.
4. Effective Altruists Pursuing Other Careers: Earning to give is just one career option. However, aside from earning to give and direct work at EA-aligned organizations, other explicitly EA career trajectories haven't been tracked as much in terms of who and how many of us are pursuing them. University and local EA groups could be one way to do this. 80k has recently reaffirmed their recommendation that more of us enter the policy world, anticipating a near future in which this is very important for EA. University EA groups might specialize in identifying departments at their university that are especially sympathetic to EA values and goals, such as an economics or political science department, and advocating that students there enter public policy related to AI governance or another field of interest to EA. Local EA groups with a serial startup founder who wants to help other effective altruists get into entrepreneurship sympathetic to the goals and values of EA could be tracked for what works and what doesn't across a variety of communities. The sky is the limit.
5. Network Effects: I don't think anyone would deny EA as a community has hit economies of scale which have transformed it into a resource pool organizations can turn to, one that is hard to define or measure in terms of the impact of individuals alone. For example, public discussion and feedback online provide a sounding board for anyone in EA that may be difficult to find elsewhere. I think most people would say the value of the Effective Altruism Forum is greater than the sum of the karma scores of its most frequent users. Measuring the impact of network effects from local EA groups seems to me possibly the most neglected of all these items. Yet it also seems the hardest to measure. I think better models and theories of how EA can or does successfully generate network effects must be made before we can meaningfully measure those network effects. Lots of different EA community members have written about their own models on the EA Forum and elsewhere. It would be great to see these different models evaluated, assessed, or developed into finer metrics so network effects can be better tracked in EA.
Thanks for the post, I just wanted to provide some more context on CEA's current evaluation of the outputs of local groups. Our main evaluation efforts are directed at the EA Community Building Grants program, and the primary metrics used to assess the grants given are career-related outcomes (including group members doing relevant internships, applying for or taking on new roles, etc.).
Our assessment of the grants is not limited to career-related outcomes, though. In retrospectively evaluating a grant, we also look at a variety of other outputs, including GWWC pledges, projects the group runs, and indicators of building a sustainable community, amongst other things.
Thanks for the feedback. It's been my impression that, through the community-building grants, CEA has become clearer about what it evaluates than it was before. I think these things either aren't talked about as much, or don't permeate through the community as fast, so much of the community isn't aware of them. I think clearly and consistently updating the community on changes is underrated, as it seems to have a dramatic impact on how actors across EA make choices or allocate resources.
While I agree the current focus is too limited, I would generally advise against emphasising "general advice on what to do everywhere", like suggestions 1 or 2, at the global level, for reasons mostly explained here. The one-sentence version is that people greatly under-appreciate the differences caused by location.
(For example, while donating money may be a good option for someone working in fintech in the US, a philosophy postdoc in Prague may be earning about £1,000/month.)
The meta-point is that, with increasing "distance" from Oxford and the Bay Area, effective altruism groups need to do more of their own prioritization, and need to think more in terms of "actual consequences" and less in terms of proxy metrics like the number of earners-to-give.
I agree "measuring projects" is important, but it seems it's not that neglected; for example, in evaluations for EA Community Building Grants, there is explicit space for this.
I have quite a lot of theory on network effects, in part explicit, but very little time to write accessible explanations. Also, writing is a slow and painful process for me. If anyone would be interested in collaborating on this and doing most of the actual writing, I would be happy to share it.
I agree more independent local EA groups need to define success and its consequences for themselves. Using proxy metrics is also just a way of getting local EA groups to share some common ground, so that we can evaluate and compare between them, e.g., for grant-making purposes, and so local EA groups have a template for what success looks like.
I would be interested in collaborating on this, and perhaps doing most of the actual writing, or at least quite a lot of it, as I don’t find writing to be as slow and painful a process.