EA Funds organisational update: Open Philanthropy matching and distancing
We want to communicate some changes that are happening at EA Funds, particularly on the EA Infrastructure Fund and the Long-Term Future Fund.
In summary:
EA Funds (particularly the EAIF and LTFF) and Open Philanthropy have historically had overlapping staff, and Open Phil has supported EA Funds, but we (staff at EA Funds and Open Philanthropy) are now trying to increase the separation between EA Funds and Open Philanthropy. In particular:
The current chairs of the LTFF and the EAIF, who have also joined as staff members at Open Philanthropy, are planning to step down from their respective chair positions over the next several months. Max Daniel is going to step down as the EAIF’s chair on August 2nd, and Asya Bergal is planning to step down as the LTFF’s chair in October.
To help transition EA Funds away from reliance on Open Philanthropy’s financial support, Open Philanthropy is planning to match donations to the EA Infrastructure and Long-Term Future Funds at a 2:1 rate, up to $3.5M each, over the next six months.
The EAIF and LTFF have substantial funding gaps: we are looking to raise an additional $3.84M for the LTFF and $3.6M for the EAIF over the next six months. By default, I expect the LTFF to raise ~$720k and the EAIF ~$400k in that period.
Our relationship with Open Philanthropy
EA Funds started in 2017 and was largely developed during CEA’s time at Y Combinator. It spun out of CEA in 2020, though both CEA and EA Funds are part of the Effective Ventures Foundation. Last year, EA Funds moved over $35M towards high-impact projects through the Animal Welfare Fund (AWF), EA Infrastructure Fund (EAIF), Global Health and Development Fund (GHDF), and Long-Term Future Fund (LTFF).
Over the last two years, the EAIF and LTFF used some overlapping resources with Open Philanthropy in the following ways:
- Over the last year, Open Philanthropy has contributed a substantial proportion of EAIF and LTFF budgets and has covered our entire operations budget.[1] They also made a sizable grant in February 2022. (You can see more detail on Open Philanthropy’s website.)
- The chairs of the EAIF and LTFF both joined the Longtermist EA Community Growth team at Open Philanthropy and have worked in positions at EA Funds and Open Philanthropy simultaneously. (Asya Bergal joined the LTFF in June 2020, has been chair since February 2021, and joined Open Philanthropy in April 2021; Max Daniel joined the EAIF in March 2021, has been chair since mid-2021, and joined Open Philanthropy in November 2022.)
- Claire Zabel, a board member of the Effective Ventures Foundation (UK) who is also the Senior Program Officer for EA Community Growth (Longtermism) at Open Philanthropy and supervises both Asya and Max, has regularly met with me throughout my tenure at EA Funds to hear updates and to offer advice on topics ranging from day-to-day issues to higher-level organisation strategy.
That said, I think it is worth noting that:
- The majority of funding for the LTFF has come from non-Open Philanthropy sources.
- Open Philanthropy as an organisation has limited visibility into our activities, though certain Open Philanthropy employees, particularly Max Daniel and Asya Bergal, have a lot of visibility into certain parts of EA Funds.
- Our grants supporting our operations and LTFF/EAIF grantmaking funds have had minimal restrictions.
Since the shutdown of the FTX Future Fund, Open Phil and I have both felt more excited about building a grantmaking organisation that is legibly independent from Open Phil. Earlier this year, Open Phil staff reached out to me proposing some steps to make this happen, and have worked with me closely on the changes listed below.
We think this could help to:
- Increase the diversity of perspectives in the funding ecosystem.
- Decrease the reliance of small to medium-sized projects on a single funder.
- Increase people’s willingness to disagree with Open Philanthropy.[2]
- Make EA Funds internally feel less beholden to Open Phil. (I have had several conversations with fund managers who believed that we should be weighing Open Phil’s views more heavily in funding decisions than I or the Open Philanthropy grantmaker evaluating EA Funds believed was necessary.)
- Allow EA Funds to pursue activities that Open Philanthropy thinks are less promising than our current work, but we believe to be highly promising.
- Decrease the risk of EA Funds’ actions negatively affecting Open Philanthropy or vice-versa.
Over the next few months, we and Open Philanthropy are making the following changes to make EA Funds a more legibly independent funder.
- The LTFF and EAIF chairs will step down from their current roles, and we are unlikely to allow people to simultaneously chair a fund at EA Funds and work at Open Philanthropy.[3]
- Instead of giving us a fixed grant, Open Philanthropy has decided to match donations at a 2:1 rate ($2 from Open Philanthropy for every $1 donated to the EAIF and LTFF) for the next six months, for up to $3.5M in Open Phil support per fund. You can find more information on the donation match in the appendix “Open Philanthropy donation matching”.
- Instead of meeting with Claire Zabel, I will meet with another member of the Effective Ventures board who doesn’t work at Open Philanthropy.
Overall, I believe the upcoming changes will be beneficial, but there are some drawbacks. In particular:
- EA Funds will need to spend a much larger proportion of its time fundraising.[4]
- I think it’s likely that the EAIF and LTFF will have substantial funding gaps, meaning we’ll need to reject a number of applications that we think are promising and above the funding bars of other funders in the space (though we may try to refer those applications to other funders).
- I have found hiring fund managers challenging in the past, and I expect replacing the current LTFF and EAIF chairs to be challenging.[5]
Comments from Open Philanthropy on the planned changes
This section was written by Claire Zabel
Our goal is to help the EAIF and LTFF become less dependent on Open Phil’s perspective on them or evaluation of them, while also giving them time to seek out and build relationships with other supporters while continuing to fund strong applicants.
That means the purpose of these matching funds is to amplify the impact of non-Open Phil people who evaluate the LTFF and/or EAIF and believe them to be good donation opportunities. The matching funds, thus, should not necessarily be interpreted as a strong endorsement of the Funds, especially going forwards, though they are premised on our past experience with the Funds suggesting that they have reasonable processes and historically supported projects we often thought seemed valuable but didn’t encounter ourselves.
We are trying to strike the right balance between encouraging different grantmaking projects in the spaces we work in to guard against our own potential blindspots, and conserving our support for the projects that seem best to us.
Counterfactually, the funding supporting this match would likely be used for other grantmaking projects at Open Phil in our longtermism grantmaking portfolio.
Current Funding Gaps
The EAIF and LTFF have received generous donations from many individuals in the EA community. However, donations to the EAIF and LTFF have been in decline over the last year. We think that some of this is due to crypto and tech stocks doing less well than the previous year (though we hope that recent market trends will bring back some donors).[6]
LTFF funding gap
- The LTFF has a funding gap of $1M/month.[7]
- Based on donations over the past few months, I estimate that the LTFF will receive (by default) roughly $120k per month ($720k over the next six months), which will be matched at a 2:1 rate by Open Philanthropy, giving us a total of $360k/month.
- This means we expect to be unable to fund around $640k/month of projects we believe should be funded.
- This gap could be filled by an additional $213k in public donations each month (or $1.27M over the next six months).
EAIF funding gap
- The EAIF has a funding gap of $800k/month.[8]
- Based on donations over the past few months, I estimate that the EAIF will receive (by default) roughly $67k per month ($402k over six months), which will be matched at a 2:1 rate by Open Philanthropy, giving us a total of $200k/month.
- This means we expect to be unable to fund around $600k/month of projects we believe should be funded.
- This gap could be filled by an additional $200k in public donations each month (or $1.2M over the next six months).
| Fund | Expected shortfall (6mo) | Additional donations (6mo) |
|---|---|---|
| LTFF | $3.84M | $1.27M |
| EAIF | $3.6M | $1.2M |
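As a sanity check, the shortfall arithmetic behind the table can be sketched in a few lines. This is a rough model, assuming the 2:1 match applies to every public dollar; `monthly_totals` is a hypothetical helper for illustration, not EA Funds code:

```python
# Rough model of the shortfall arithmetic above (figures from this post).
# With a 2:1 match, each public dollar becomes $3 of total funding, so the
# extra public donations needed are roughly shortfall / 3.

def monthly_totals(gap, baseline_donations, match_rate=2):
    """Return (total funding, shortfall, extra public donations needed) per month."""
    total = baseline_donations * (1 + match_rate)  # donations plus the 2x match
    shortfall = gap - total
    extra_needed = shortfall / (1 + match_rate)
    return total, shortfall, extra_needed

# LTFF: $1M/month gap, ~$120k/month baseline donations
print(monthly_totals(1_000_000, 120_000))  # total $360k, shortfall $640k, ~$213k extra
# EAIF: $800k/month gap, ~$67k/month baseline donations
print(monthly_totals(800_000, 67_000))     # total $201k, shortfall $599k, ~$200k extra
```

Over six months, the extra donations come to roughly $1.27M (LTFF) and $1.2M (EAIF), consistent with the table.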
Appendix: Open Philanthropy donation matching
Open Philanthropy is planning to match public donations at a 2:1 rate, meaning that they will give $2 for every $1 fundraised for the EAIF and LTFF (contributing up to $3.5M per fund) for six months, starting from the publication of this post. This is subject to several conditions being met, indicating that the relevant funds are operating broadly similarly to how they have been in the past; we expect these to be met and will flag if things change such that additional funding might no longer be matched.
Open Phil won’t match funds from other funders who contribute >$5M in aggregate to the two funds per year, or who seem to contribute >$20M/yr to the greater EA/LT/x-risk reduction space.[9] The LTFF and EAIF can do things that don’t meet the conditions above; those things just will not be subject to the default 2:1 match.
Open Philanthropy is not committing to matching funds on longer timescales at this time, though it may do so, plausibly at a lower match rate, given that in the future, we will have had more time to build relationships with other funders.
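Concretely, the cap mechanics described above might work as follows. This is a sketch under the assumption that the $3.5M per-fund cap applies to Open Phil’s matching contribution (so ~$1.75M of public donations would exhaust a fund’s match); `matched_amount` is a hypothetical helper, not part of any stated implementation:

```python
# Sketch of the 2:1 match with an assumed $3.5M per-fund cap on Open Phil's
# matching contribution.

def matched_amount(public_donations, rate=2, cap=3_500_000):
    """Open Phil's matching contribution for a given amount of public donations."""
    return min(public_donations * rate, cap)

print(matched_amount(1_000_000))  # 2000000 -- $1M donated draws a $2M match
print(matched_amount(1_750_000))  # 3500000 -- the match is exactly exhausted
print(matched_amount(2_500_000))  # 3500000 -- further donations go unmatched
```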
[1] Note that EA Funds staff who were also Open Phil employees did their Funds work on a volunteer basis.
[2] Which I believe will be beneficial for both Open Philanthropy and the epistemics of the EA and longtermist communities.
[3] Though this should not be taken as a strong commitment at this time.
[4] Which is also the case for many other projects after the FTX Future Fund shut down.
[5] My understanding is that Open Philanthropy and other grantmakers have also had difficulties hiring excellent fund managers. Some of the reasons that we (EA Funds) have found this hard include:
- Limited opportunities for hands-on experience: EA cause areas provide limited opportunities for individuals to gain experience in grantmaking.
- Insufficient management capacity: EA Funds lacks sufficient management capacity from experienced grantmakers, leading to potential operational inefficiencies.
- High opportunity cost for potential grantmakers: many prospective grantmakers have alternative options for their time and could contribute to other valuable projects instead.
- Perceived career limitation due to EA affiliation: the strong association with EA might discourage some potential candidates, as they worry it could limit their future career opportunities.
[6] I have also invested relatively little time in stewarding donors, instead prioritising increasing the number of promising applications and improving our grantmaking.
[7] In the sense that we are confident we can distribute this amount of funding to projects we deem to be above the current funding bar.
[8] In the sense that we are confident we can distribute this amount of funding to projects we deem to be above the current funding bar.
[9] From Claire Zabel: “We think this would run the risk of making Funds more dependent on current major donors, which goes against our intent.”
I don’t know if this will work, but I’m pretty interested in making the funding ecosystem less unipolar. Thanks for taking a big risk here and doing a weird, but to my mind, good thing.
I have pretty mixed feelings about this. I greatly appreciate and respect the constraints you are operating under and the hard, often thankless work of those involved.
I have a lot of thoughts about this update and the funds in general based on my direct experience as an applicant, from being involved with a number of projects and applications and from what I have heard from others.
I might find the courage to say more about this publicly at some point, but I did want to say that my experience with EA Funds is a significant part of my decision to shut down AI Safety Support.
I’m sorry you had a negative experience as an applicant. If you would like to share feedback on the LTFF feel free to message me on the forum (I’ve also reached out over email).
Thanks for this update.
Where will funding for operations come from now? (And how much does EA Funds spend on operations?)
I’m not sure right now; we might fundraise for this separately, charge the individual funds, or ask a small set of private donors. I think our operations costs come out to around ~$700k/year (but a large fraction of that is variable costs, so if we are moving less money we’ll also end up having lower overheads).
Does the figure for operations costs include EA Funds’ share of general EVF operations overhead (e.g., HR, legal)?
EA Funds pays EV for ops support, otherwise it’d be a lot cheaper (Just Caleb’s salary + occasional contractors).
So you estimate $2.4M in donations this year and $700k in operations? What was that ratio last year? And how much do you expect to decrease operations if donations remain low or decline?
LTFF made ~$10M in grants last year (I think the figures are similar for EAIF). The $700k figure is from last year rather than a projection going forwards, so that’s <4% operational overhead.
Also, to be clear, most of the operational costs are on the grantmaking side, not the donation side. It would be extremely inaccurate to model EA Funds as primarily doing donation forwarding.
I’m not sure which number you are quoting here, but you can find the specific numbers here.
Some things to note
* Our money granted is much higher than our donations-from-the-public figure, as we received money from institutional donors like Open Phil.
* Operations costs track grantmaking more than donations, as ~all of the variable costs are incurred through grantmaking, so our operations costs should decrease naturally if we are doing less grantmaking.
* Operations costs are for all four EA Funds funds, not just the EAIF and LTFF.
Thanks! I’m multiplying the quoted numbers in the OP.
Just to be clear, I’m not anti-overhead (often the opposite), and 4% is low. @calebp I wrongfully thought this was just EAIF, not all funds, so thank you for making me aware of my mistake. It’s interesting to learn that most of the operational cost goes to the grantmaking side, and I’m happy it’s on that side. Thanks and good luck with the funds!
Thanks for the update, Caleb!
Have you publicised grantmaking positions in the past? Having a public call might attract more candidates.
To what extent do you think some grantmaking tasks could be performed by someone with little/no grantmaking experience, but general knowledge about EA (e.g. someone like me)? It looks like, as of now, each grantmaker assesses one grant from end to end, but it might be the case that some tasks could be outsourced.
UPDATE 2023/12/21: Open Phil’s $3.5M donation matching for the Long-Term Future Fund has now been filled[1]. So your donations to LTFF will no longer be matched. That said, this fundraising post was written 4 months ago, and we’d like to continue fundraising (especially given that December is an unusually good time to fundraise).
Open Phil’s donation matching for the EA Infrastructure Fund has not been filled (currently $1.3M/$3.5M), and my current projection is that by default it won’t be filled by the deadline (end of Jan 2024). So to the extent that you’re fairly indifferent between LTFF and EAIF, and believe that either fund is a better use of marginal resources than OP’s marginal dollar, it might make more sense to donate to EAIF than LTFF.[2]
(I’m the main person in charge of public comms for EA Funds)
How does this change affect the eligibility of near-term applicants to LTFF/EAIF (e.g., those who apply in the next 6 months) who have received OpenPhil funds in the past / may receive funds from OpenPhil in the future? Currently my understanding is that these applicants are ineligible for LTFF/EAIF by default – does this change if EA funds and Open Philanthropy are more independent?
Thanks for the update! This kind of information is helpful for planning. I’d also love to see projections/simulations from different organizations about future funding. Such predictions would be prone to high variance error, but I bet the models would be better than mine.
At the moment, I’m leaning towards keeping my safety work on pretty low hours (which provides more benefit per hour than working full time in my experience so far) while pursuing opportunities for high throughput earning to give. I’m concerned/alarmed that there are a significant number of possible worlds where my individual earnings would swamp the current budget; that seems like a bad sign from an evidence-of-sufficient-coordination standpoint.
I like seeing this. It’s a great thing to try out and seems like a good idea overall, mostly to decrease the unipolarity of the funding ecosystem. Having no overlap in people who work for Open Phil and help out with EAIF/LTFF seems really important. Though I don’t think there’s any harm for Open Phil to continue matching donations after the six months, but it definitely doesn’t have to be 2:1. 0.5:1 would already be great. I have low confidence in these opinions, just quick thoughts.
FWIW, this is not the current plan – the post mentions a desire to avoid having the same person chair a fund at EA Funds while at OPP, but not about someone being a fund manager at EA Funds while at OPP (though presumably this would only be a minority of fund managers).
I would like to add that it might be important to communicate this in an email to all currently funded projects by EAIF/LTFF ;)
Wouldn’t it make sense to find replacements before the chairs step down, to prevent a lack of capacity during the transition period?
(Not sure if this is the plan, but it sounds like it is from your post)
I would like to offer a suggestion for how to manage the potential funding gaps from the perspective of a grassroots volunteer organizer that might apply to the funds.
Good project organizers with uncertain funding often have several contingent versions of their project that they can implement at various funding levels.
The application process does this already to some extent, but the fund could more aggressively respond to a project with a “We like the project, we have funding gaps, can you send us a version of the project at 1⁄3 of the initially proposed funding level.” From my perspective as an applicant, this would be a perfectly fine, positive response as an organizer.
Also, as a small-scale EA-minded organizer, I am constantly trying to increase the “impact productivity” of my projects (i.e. impact per donated dollar). This can be done by leveraging and cultivating non-cash resources (e.g. volunteer labor) or selectively cutting the least productive component costs of a project. If your interaction with applicants can help preserve the most productive pieces of their projects during times of low funding, then by keeping more good projects in your funding pool (albeit at lower per-project funding), those projects can increase their “impact productivity” over time.
The impact of the EA movement grows over time both through funding growth, and increasing the productivity and efficiency of the funds spent. IMHO, as much effort should be spent increasing the productivity of donated fund spending, as is spent marketing for increased donations. EA donors are a pretty limited pool. Many donations that might be received could counterfactually have been spent on other high priority EA causes.
Maybe I missed it, but I did not see in the post any mention of how funding productivity might be increased in response to some of the forecasted funding limitations.
Thanks for the suggestion. A decent fraction of applicants already outline different budgets for their projects, and we generally feel comfortable adjusting their budgets based on our willingness to pay. At the same time, we want to be mindful of not underfunding projects or leaving grantees with deals they would rather turn down but feel uncomfortable doing so due to grantmaker-grantee power imbalances.
“IMHO, as much effort should be spent increasing the productivity of donated fund spending as is spent marketing for increased donations.”
I think this is a good point. I estimate that we spend something like 100x more time evaluating grants and prioritising between them (which I see as trying to increase the productivity of donated funds) than fundraising. I expect we should actually spend more time fundraising.
Thank you for your response. I find this piece of the response interesting:
“we want to be mindful of not underfunding projects or leaving grantees with deals they would rather turn down but feel uncomfortable doing so due to grantmaker-grantee power imbalances.”
As someone on the grantee side of the equation (though I don’t apply to this particular fund), I would much prefer an under-funded response of 30% of initial budget request rather than a rejection which is effectively an offer of 0% of the request. But I have pretty thick skin.
I think what I am asking for is as much communication and negotiation between the grantmaker and grantee as possible in adjusting the project offer, better matching the resource request (demand) from the grantee to the resource supply available to the grantor. A better match and greater information exchange increase systemic supply/demand efficiency.
This also allows the grantee to design a better project ask the next time so that grant-giving productivity increases over time. 100% rejection can lead to disengagement. I think it is better for the EA movement if innovating project organizers who are proposing fundable projects can stay engaged.
Though I understand that grant evaluation time is a constraint, so there may not be resources for the extra grantor/grantee communication.
I also understand your point that fundraising is underprioritized. But given that donors may already be giving to other high-impact areas, this might be somewhat OK.
This is pretty exciting, and congrats on making this decision. I’m interested in the ways that this distancing can contribute to increasing the diversity of perspectives. How are you intending to ensure this happens?
edit:
I don’t think we have explicit plans to try and increase the diversity of perspectives represented in grantmaking. I don’t think we’ll be intentionally optimising against Open Phil’s perspective, but I do think we’ll make some decisions differently to what they’d do when looking at similar grants. I think we’re really more about trying to figure out which grants seem most cost-effective according to the fund’s worldview (and ideally communicating that worldview to donors), as opposed to hedging our beliefs over the community’s beliefs or something.
I do think the funds already represent various perspectives (e.g. there is a decent amount of disagreement on the LTFF over which alignment agendas seem promising), and as I hire more fund managers the makeup of worldviews is something I’ll bear in mind (though I’m mostly looking for strong grantmaking skills and good epistemics).
Thanks Caleb for your reply! One quick thing: if you’re not explicitly aiming to increase the diversity of perspectives, it feels a little (unintentionally) misleading to say the first thing this could help with is “Increase the diversity of perspectives in the funding ecosystem.” While I know this is technically not the same as aiming to do that, the post strongly implies that you are aiming to do that (and it does seem you are in fact aiming to do that a bit!)
I think this is fair; I removed the first line from my comment above where I say we’re not explicitly trying to increase diversity. I think I misinterpreted your original question, and I wanted to clarify what I mean by increasing diversity of perspectives in grantmaking—I’ll try to give a proper answer to your original question later with more detail after thinking about it a bit more.
The EAIF and LTFF will continue to struggle for individual donor funding until they can provide as good a value proposition as the AWF and GHDF. As a longtime EA donor, my sticky impression of the EAIF and LTFF is that they are a slush fund for OPP/EVF to dole out to their friends with questionable impact. Hopefully the separation will help change that impression and lead to better grants.
[EDIT: This impression may not be entirely accurate—it could be outdated, or I may be unfairly lumping the LTFF in with EAIF (I’m skeptical of longtermism and wish it was less prominent in EA). Regardless, the value proposition for individual donors is different from Open Phil/EVF, who seem happy and willing to pay tens of millions for fancy retreat centers. I’ve often cringed at EAIF grants given what the money could have done if in the AWF or GHDF fund instead.]
LTFF recently released a long payout report, which you might or might not find helpful to dive into. FWIW I think relatively few of our grants are contingent on philosophical longtermism, though many of them are probably only cost-effective if you think there’s non-trivial probability of large-scale AI and/or biorisk catastrophes in the next 20-100 years, in addition to other more specific worldviews that fund managers may have.
Thanks for linking to that! I appreciate the transparency in the write-up, and thanks for responding well to criticism. I don’t have the knowledge to evaluate the quality of the AI-related LTFF grants. But I do have some experience in pandemic/aerosol disease transmission, and I’ve been pretty stunned by the lack of EA expertise in the space despite the attention. Other experts have told me they share the concern. It seems there is a strong bias in EA to source knowledge from “value-aligned” people that brand themselves as EAs, even if they aren’t the main experts in the field. That can result in a tendency to fund EA friends or friends-of-friends, or people they see as “value-aligned”, rather than proactively seeking out expertise. I’ve seen a few examples of it in EA Funds and in other EA domains, but I don’t have a clear picture of how widespread the issue is. I also know EA Funds doesn’t really have infrastructure set up to prevent such conflicts of interest. I don’t think the AWF and GHDF have as much of an issue because they have a much stronger evidence basis, and therefore it is harder to argue funding friends is the most effective use of funds.
Thanks for the feedback! But this all sounds very generic and I don’t know how to interpret it. Can you give specific examples of pandemic/aerosol grantees we’ve funded but you think shouldn’t be funded, or (with their permission ofc) grants that we rejected that you think should be funded?
Happy to message or chat 1:1; I don’t want to dispute specific LTFF grants in the comment section.
DM’d, though I also think disputing specific LTFF grants in EA Forum comments is a time-honored tradition, see eg comments here.
I downvoted this comment because of the following section: “I’ve often cringed at EAIF grants”. I don’t think people should be using sentences like “I’m disgusted by your work” or “I cringe at your work” in a cooperative environment. There are better ways to make very strong or even devastating criticisms.
If you want to share specific examples, you can find most of the EAIF grants here.
I think most grants to university, city, or national groups fund work that would have been done just as effectively by volunteers. Some of the university grants are particularly egregious, given how nearly all other college clubs exist just fine without paid organizers. “We are a student group interested in the most effective causes, and oh by the way funding us to organize this group for a semester is of similar levels of effectiveness as preventing 3 kids dying from malaria in poor countries.” I can think of few things more effective at turning people away from EA than college students learning the EA organizer is paid lots of money for it.
(I don’t work for the EAIF, and have limited visibility into their past decisionmaking)
Hmm, I think it’s fairly likely that the added value of having people devote significant time to university organizing (over what you could realistically get with volunteers) has higher EV via getting more future donations or via future hires than direct donations.
Do you disagree with this characterization of the expected consequences, or is the disagreement non-consequentialist in nature?
Separately, I also expect college club organizers to mostly be too young and relatively unknown entities for grantmakers that the “dole out to their friends” concern should be pretty minimal.
I think the EAIF vastly overstates the effectiveness difference between paid and unpaid organizers, and dismisses the reputational risks of having paid organizers. Many college groups thrive without paid organizers, and EAIF-level funding for paid organizers only starts being necessary once group sizes reach 100. I don’t think there are any EA college groups that large, and they could fundraise for it themselves. I think the reputational harm—that EA is for self-serving grifters—causes far more damage than the marginal benefit from paid recruitment. It completely undercuts the message of using resources effectively.
The EAIF isn’t supporting university groups anymore (though I don’t think it’s implausible that we will start doing this again in the future).
I think we have a pretty good sense of which uni groups and activities tend to produce people who go on to do high-impact work. I don’t think that is the only metric on which we should assess uni groups, but it’s an important one. I do think that groups with paid organisers tend to have more measurable impact than groups without (though ofc there are selection effects). The groups also generally seem larger and more productive.
I think the reputational harm effects that you pointed out exist, but I don’t think they are particularly large. My personal view is that people should be compensated for doing challenging work that produces large amounts of altruistic value and I think there is plenty of evidence to suggest that many EA groups do have a large positive impact e.g. the Rethink Priorities and Open Phil surveys.
EA Funds would like to do more retroactive investigation into the effectiveness of past grants. If you have ideas on which metrics would convince you that paid organizers are an effective vs. ineffective use of marginal resources, that’d be really appreciated! But of course there’s no expectation that you’d do our work for us either!
I don’t think I fully understand the reputational argument. The most naive interpretation of “It completely undercuts the message of using resources effectively” is that you’re simply assuming the conclusion. If the EV of having paid organizers is very low (or worse, negative), then of course this will be a hypocritical message to send to others. But if the EV is high (or at least higher than counterfactuals), then your actions are in line with your moral beliefs.
FWIW, I’m pretty sure EAIF organizers do, or at least did, believe their grants are cost-effective. But as you say, they might well be wrong.
Do you have good specific examples? Impressive college groups that lead to highly talented young people doing positively impactful projects would be great to emulate!
This recent post from Dave, a university EA group leader, is quite relevant.