CEA is fundraising, and funding constrained
The Centre for Effective Altruism (CEA) has an expected funding gap of $3.6m in 2024.
Some examples of things we think are worth doing but are unlikely to be able to fund by default:
Funding a Community Building Grant in Boston
Funding travel grants for EAG(x) attendees
Note that these are illustrative of our current cost-effectiveness bar (as opposed to a binding commitment that the next dollar we receive will go to one of these things).
In collaboration with EA Funds we have produced models where users can plug in their own parameters to determine the relative value of a donation to CEA versus EA Funds.
The role of an interim executive is weird: whereas permanent CEOs like to come in with a bold new vision (ideally one which blames all the organization’s problems on their predecessor), interim CEOs are stuck staying the course. Fortunately for me, I mostly liked the course CEA was on when I came in.
The past few years seem to have proven the value of the EA community: my own origin cause area of animal welfare has been substantially transformed (e.g. as recounted by Jakub here), and even as AI safety has entered the global main stage many of the people doing research, engineering, and other related work have interacted with CEA’s projects.
Of course, this is not to say that CEA’s work is a slam dunk. In collaboration with Caleb and Linch at EA Funds, I have included below some estimates of whether marginal donations to CEA are more impactful than those to EA Funds, and a reasonable confidence interval very comfortably includes the possibility that you should donate elsewhere.
We are fortunate to count Open Philanthropy (and in particular Open Phil’s GCR Capacity Building program) among those who believe we are a good use of funding, but they (reasonably) prefer not to fund all of our budget. This leaves us with a substantial number of projects which we believe would produce value, if only we had the funding to start or scale them.
This post outlines where we expect marginal donations to go and the value we expect to come from those donations.
You can donate to CEA here. If you are interested in donating and have further questions, feel free to email me (firstname.lastname@example.org). I will also try to answer questions in the comments.
The basic case for CEA
Community building is sometimes motivated by the following thought experiment: suppose you spent a year telling everyone you know about EA and getting them excited. You could probably get at least one person excited. This means you will have roughly doubled your lifetime impact, as both you and this other person will go on to do good things. That’s a pretty good ROI for one year of work!
This story is overly simplistic, but is roughly my motivation for working on (and donating to) community building: it’s a leveraged way to do good in the world.
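The back-of-the-envelope version of that story can be written out explicitly. All numbers below are illustrative assumptions of mine (career length, normalized impact), not figures from this post:

```python
# Toy version of the community-building leverage story.
# All numbers are illustrative assumptions, not figures from the post.
career_years = 40        # assumed remaining career length
impact_per_year = 1.0    # normalize your own direct impact to 1 unit/year

# Baseline: you do direct work for your whole career.
baseline = career_years * impact_per_year

# Alternative: spend one year recruiting one person as impactful as you,
# then return to direct work; they also work a full career.
with_recruiting = (career_years - 1) * impact_per_year \
    + 1 * career_years * impact_per_year

print(with_recruiting / baseline)  # 1.975, i.e. roughly doubled impact
```

The ratio approaches exactly 2 as the assumed career length grows, since the one year spent recruiting matters less and less relative to the recruit’s full career.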
And it does seem to be the case that many people whose work seems impactful attribute some of their impact to CEA:
The Open Philanthropy longtermist survey in 2020 identified CEA among the top tier of important influences on people’s journey towards work improving the long-term future, with about half of CEA’s demonstrated value coming through events (EA Global and EAGx conferences) and half through our other programs.
The 80,000 Hours user survey in 2022 identified CEA as the EA-related resource which has influenced the most people’s career plans (in addition to 80k itself), with 64% citing the EA Forum as influential and 44% citing EAG.
This selection of impact stories illustrates some of the ways we’ve helped people increase their impact: by providing high-quality discussion spaces where they can examine their ideas, values, and options for making an impact, and by connecting them to advisors, experts, and employers. High-impact case studies include:
Someone initially unconvinced by EA ideas took part in the introductory virtual program; by the end of the program they had given notice to their employer with the intent of finding an AI safety job, and they now work as a software engineer at an AI safety organization.
Someone on the technical staff of an AI safety team estimated that their EA group sped up their involvement in longtermist work by 2–6 years.
Six of the 17 people doing Training for Good’s EU tech policy fellowship came from the EA Forum.
Of the 64 top-scoring candidates for Charity Entrepreneurship’s program in 2022, 13 were found via EAG or EAGx, making it their second biggest source of top candidates.
Our financial runway currently extends until June 2024 (based on our current reserves, existing funding commitments, and budgeted spending). I’m ~80% confident that OP’s GCRCB program will fund us at roughly similar levels in 2024 as they did in 2023, meaning that where marginal funding on top of OP goes today is likely to be fairly similar to where it would go next year.
Therefore, I expect marginal funding that we raise from other donors (i.e. you) to most likely go to the following:
Community Building Grants: We would be interested in funding an organizer in Boston but currently don’t have the budget. (Approximate cost: $110,000.)
Travel grants for EA conference attendees. Many of the people we would like to attend our conferences are financially unable to do so; giving them some money to cover travel and accommodation ($350 on average for EAGx, $1,100 on average for EAG) can result in them attending. We estimate that we could spend an additional $295,000/year here before hitting diminishing returns (i.e., grants that aren’t actually counterfactual, or attendees whose impact our events are less likely to influence).
EA Forum: I’m unable to succinctly describe the need here, and we will be doing a follow-up post solely devoted to Forum fundraising. But I do want to briefly mention that I would particularly like the Forum to have diverse funding since some of its value comes from hosting criticism of “powerful” people/organizations. I’m not aware of any instances of donors e.g. pressuring CEA to remove a post that’s critical of them, but “hey can you please be a predominant funder of this thing which is critical of you” feels like an uncomfortable pitch to have to make.
We are raising unrestricted funds, but if one of these projects (or something else) seems substantially more cost-effective to you than the others, I would be interested to hear that. Also please note that these are illustrative of our current cost-effectiveness bar (as opposed to a commitment that the next dollar we receive will go to one of these things).
Background about CEA
A quick reminder of what we do
Our major programs are:
Community Health & Special Projects
Hearing and investigating concerns from community members about misconduct or risks related to other people or organisations in the community
Reducing risks related to sensitive projects and risky actors
Conveying important and decision-relevant information to actors in the community
Identifying specific problems and finding specialists to work on them
Communications: Promoting and protecting EA and related ideas beyond the EA community, by engaging with the media and on social media
Our public dashboard includes key metrics for some of our core programs. Our website has more about our strategy (but it’s worth reiterating that being in an interim period means we’re more uncertain about our strategy in 2024 and beyond than we typically would be).
Our budget and funding gap
CEA’s total budget in 2023 is $31.4m, although our actual expenditures are ~20% below budget YTD.
[Budget breakdown table by program not reproduced here; programs include Community Health & Special Projects, plus centralized infrastructure such as the Exec Office and People Ops.]
We expect that at the beginning of 2024 our 2024 baseline budget will be similar (80% CI: $28.2m - $32.2m). Taking the low end of that estimate, assuming Open Phil’s GCRCB program funds 80% of it (which we have not yet secured), and adding an expected ~$2m from other established donors, leaves an expected funding gap of $3.6m.
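The gap figure can be reproduced with a quick calculation. The 80% share and the ~$2m from other donors come from the text above; the rounding to $3.6m is mine:

```python
# Reconstruction of the funding-gap arithmetic described above.
low_budget = 28.2e6   # low end of the 80% CI for the 2024 baseline budget
op_share = 0.80       # assumed Open Phil GCRCB share (not yet secured)
other_donors = 2.0e6  # expected funding from other established donors

# Whatever OP doesn't cover, minus other donors, is the remaining gap.
gap = low_budget * (1 - op_share) - other_donors
print(f"expected funding gap: ${gap / 1e6:.2f}m")  # $3.64m, i.e. ~$3.6m
```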
Should you donate?
There are a bunch of organizations which work on things similar to what CEA does, but my guess is that the most directly comparable alternative donation target to CEA is EA Funds, and in particular the EA Infrastructure Fund (EAIF). So EA Funds and I collaborated on a set of BOTECs (back-of-the-envelope calculations) to help donors decide where to donate (see the side note below).
As with most cost-effectiveness analyses, reasonable disagreements in parameter settings can result in >10x differences in cost-effectiveness. So we are providing a few different models which you can plug your own parameters into.
These models tend to be fairly brief, as our goal is to convey a sense of the key considerations, and we think that a more detailed model would obscure more than enlighten. That being said: we would be very excited for people to build upon these models and share the results (possibly during donation debate week!). You should also note that the BOTECs don’t focus on marginal projects at CEA.
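To make the “plug in your own parameters” idea concrete, here is a minimal sketch of what such a BOTEC can look like. This is my own illustrative skeleton: the parameter names and all numbers are placeholders, not the structure or figures of the actual published models:

```python
# Hypothetical BOTEC skeleton for comparing donation targets.
# All parameter names and values are illustrative placeholders.

def value_per_dollar(cost, people_reached, p_counterfactual, value_per_person):
    """Expected impact units per dollar: reach x counterfactual share x value / cost."""
    return people_reached * p_counterfactual * value_per_person / cost

# Example: compare a CEA program against an EAIF grant using your own inputs.
cea = value_per_dollar(cost=295_000, people_reached=400,
                       p_counterfactual=0.5, value_per_person=3_000)
eaif = value_per_dollar(cost=295_000, people_reached=300,
                        p_counterfactual=0.6, value_per_person=3_000)
print(f"relative cost-effectiveness (CEA / EAIF): {cea / eaif:.2f}")
```

With these made-up inputs the ratio works out to about 1.1, and small, reasonable changes to any parameter flip which option looks better; this is the sense in which parameter disagreements can drive >10x differences in conclusions.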
Huge thanks to Caleb and Linch from EA Funds for agreeing to collaborate on these models with me!
This is a nice comparison because the work is extremely comparable, but it’s a bad comparison because EAIF thinks this grant would be below their current funding bar. I made this BOTEC before I knew that fact, so I’m publishing it because having more data seems good, but caution is advised when interpreting the results. It at least shows that CEA is not substantially below the old EAIF funding bar.
Thanks to Zoe Williams for sharing some information about this grant and being willing to do a BOTEC of its effectiveness publicly.
Side note on the collaboration with EA Funds
Unfortunately, the EA Funds staff weren’t able to devote much time to these comparisons, so the work was mostly done by me and Angelina Li (who is also at CEA). We’ve done our best to make them impartial, but you may wish to consider the origin of the models when evaluating them.
Here are some general points you may wish to consider, beyond the BOTECs:
If you generally agree with the priorities of major EA donors, and know about projects that they would like to fund but can’t for COI, PR, or other reasons, you might want to donate to those instead of CEA. (Similarly to how, if you and an acquaintance both wanted to donate to a certain kind of project, and only you could donate to the US-based charities because you were American, it might make sense for you to ignore the non-US-based charities.)
If you disagree with the priorities of major EA donors (notably, if you want CEA to be less GCR-focused than OP’s GCRCB team), you may wish to donate to CEA to preserve funder diversity.
OP has never, to my knowledge, asked CEA to stop doing something because it wasn’t GCR-focused enough, but they have declined to fund activities which are not GCR-focused enough, which de facto means that CEA will not do those things unless we get outside funding. However, it’s hard for me to predict what OP will want to fund next year, so I hesitate to make this pitch very strongly; still, it’s something you may want to consider.
You may also care about funder diversity for other reasons: limiting the COI that CEA has when e.g. hosting criticism of OP, or increased robustness to funding shocks that affect OP.
If you think there are economies of scale, you might want to donate to CEA (as one of the larger community building organizations). Conversely, if you think there are diminishing returns, you might want to donate to something like EAIF or MCF.
You can donate to CEA here
Or comment below with any questions you have
Last updated 13 November.
We also expect that our budget will change during the year: we have a midyear budget update process, and we may update our spending plans before then depending on the timing of a new CEO being in office and the success of our fundraising efforts.
Open Phil’s GCRCB program tries not to fund more than their “fair share” of organizations like CEA; this year it was 80%, and my understanding is that the future amount has not been decided but will depend on the amount of funding elsewhere in the meta GCR reduction ecosystem.
I’ve heard feedback from some people that the Squiggle code can be hard to follow if you haven’t worked with it before; Ozzie Gooen, one of the creators of Squiggle, suggested joining the Squiggle Discord and said that they would be happy to help answer any questions people may have.