Public reports are now optional for EA Funds grantees
Public reports are now explicitly optional for applicants to EA Funds. We have updated our application form and other outward-facing materials to reflect this change.
If you are an individual applicant or a new organization, choosing not to have a public report will very rarely affect the chance that we fund you (and we will reach out to anyone for whom it would make a substantial difference).
If you are an established organization, choosing not to have a public report may slightly decrease the chance that we fund you. We are generally happy to omit mentions of individuals from public grant reports of organizations at their request.
If we are uncomfortable making a grant privately with EA Funds money, we may ask to forward your application to private donors we are connected to, or to other large funders in the space.
Broadly, we think there are many valid reasons not to want a public report, and we don’t want anyone to be discouraged from applying to us for funding. If you or someone you know could use funding productively but was previously discouraged by our payout reports, please apply or encourage them to apply.
I find the lack of transparency that comes along with this problematic. Is there a place where I can read about the reasons for the change, or can you elaborate here so that I can understand the trade-off?
We got feedback from several people that they weren’t applying to the funds because they didn’t want to have a public report. There are lots of reasons that I sympathize with for not wanting a public report, especially as an individual (e.g. you’re worried about it affecting future job prospects, you’re asking for money for mental health support and don’t want that to be widely known, etc.). My vision (at least for the Long-Term Future Fund) is to become a good default funding source for individuals and new organizations, and I think that vision is compromised if some people don’t want to apply for publicity reasons.
Broadly, I think the benefits to funding more people outweigh the costs to transparency.
Thanks for the response.
Is there a way to make things pseudo-anonymous, revealing the type of grants being made privately but preserving the anonymity of the grant recipient? It seems like that preserves a lot of the value of what you want to protect without much downside.
For example, I’d personally be very skeptical that grants for personal mental health support are the best way to improve the long-term future, and that would make me less likely to support the LTFF; if all such grants weren’t public, I wouldn’t know about them. There might also be people for whom the opposite is true: they wouldn’t donate to the LTFF because they didn’t know such grants were being made. If you said, e.g., “$100K was given out last quarter to support the mental health of individuals,” we’d get what we want, but we’d still have no idea who the recipients are.
I like this idea; I’ll think about it and discuss with others. I think I want grantees to be able to preserve as much privacy as they want (including not being listed in even really broad pseudo-anonymous classifications), but I’m guessing most would be happy to opt-in to something like this.
(We’ve done anonymous grant reports before but I think they were still more detailed than people would like.)
Any updates here? I share Devon’s concern: this news also makes me less likely to want to donate via EA Funds. At worst, the fear would be this: so much transparency is lost that donations go into mysterious black holes rather than funding effective organizations. What steps can be taken to convince donors that that’s not what’s happening?
(I am the new interim project lead for EA Funds and will be running EA Funds going forward.)
I completely understand that you want to know that your donations are used in a way that you think is good for the world. We refer private grants to private funders so that you know that your money is not being used for projects that you get little or no visibility on.
I think that EA Funds is mostly for donors who are happy to lean on the judgment of our fund managers. Sometimes our fund managers may well fund things like mental health support if they think it is one of the best ways to improve the world. The LTFF and EAIF, in particular, fund a variety of projects that are often unusual. If you don’t trust the judgment of our fund managers or don’t agree with the scope of our funds, there are probably donation opportunities that are a better fit for you than EA Funds.
We try hard to optimise the service for grantees, which means we may fall short of building the best service for our donors. We are exploring more donor-focused products with GWWC, which we hope to offer soon.
GWWC’s effective charity recommendations page states, “For most people, we recommend donating through a reputable fund that’s focused on effectiveness.” There follows a list of 8 funds, of which the first 4 are EA funds.
If your view as EA funds lead is that EA funds are only suitable for donors who personally trust the judgment of your fund managers, then something seems to have gone wrong with the messaging, because “most people” won’t be in a position to form a view on that.
I also note that none of the funds list under “Why you might choose not to donate to this Fund” that the fund may not account for its donations, which I suspect (as your comment implies) would be a highly material factor to at least some donors. The EA Infrastructure Fund does indicate that a potential donor might not donate if they have concerns about grantmaker independence, but that’s not quite the same point, and there’s no similar warning for the other funds.
The difficulty here is that you understand EA Funds as existing for a narrow set of donors (those who are in a position to assess the trustworthiness of individual fund managers). That may well be a sensible thing to exist, but the funds are being marketed as suitable for a much wider class of donors (“most people”).
I agree with the last paragraph above, and want to point out that the funds, including LTFF, are still being marketed this way at GWWC: https://web.archive.org/web/20240409182610/https://www.givingwhatwecan.org/donate/organizations
Huh, there does seem to be a communication mismatch. Though I do think the Animal Welfare fund and the Global Health fund are more legible than LTFF.
Now that the whole FTX thing has happened, have you reconsidered your position about the trust the public should have in organizations that don’t share the distribution of funds?
+1. I always assumed that the ‘Open’ in ‘Open Philanthropy’ referred to an aspiration for a greater degree of transparency than is typically seen in philanthropy, and I generally support this aspiration being shared in the wider effective altruism philanthropic space. The EA Funds are an amazingly flexible way of funding extremely valuable work – but it seems to me that this flexibility would still benefit from the scrutiny and crowd-input that becomes possible through measures like public reports.
So after FTX, how do you all respond to this? Can you maybe see more why we are hesitant to donate to funds that don’t say how they allocate the money?
This is extremely troubling. Why in the world would I want to donate to some mystery fund? How can we ensure the money isn’t going somewhere we find highly objectionable?
See earlier discussion here.
(My own takes, speaking only for myself)
Probably you shouldn’t donate to us if you aren’t comfortable with trusting our judgements (of which some of the reasoning won’t necessarily be public)!
Other options include a) donating to fund projects within your network that you think are unusually good, b) donating to established organizations that you trust, and c) the newly set-up Longtermism Fund.
I’m fine with trusting your judgement if I can verify what you decided. This provides a mechanism for donors to express displeasure with your financial decisions.
But this has turned this entire fund into a black box. For all we know you could be giving money to charities run by your friends. Throwing money into a mystery black box is so far from the values of effective altruism.
How can you honestly tell people that the most effective way to donate their money is to give it to you and totally trust that you will do a good job with it.
Thanks for sharing your concern!
The vast majority of projects do not opt out of public reporting, and as a charity we have trustees who oversee the large grants that we make.
As Linch said, I do think this change to our requirements requires you to place some more trust in our grantmakers, but I still think that, given the sensitive nature of some of our grants, this is the right call.
Why don’t you just add an option for people to donate funds to public reporting only projects?
I apologize if I’m coming off rude, but I think the reason this has me particularly peeved is that this isn’t just some normal charity, this is one of the major charities behind the movement.
If it turns out that you mismanaged this fund, you are going to tarnish the entire effective altruism movement. This is the type of thing that gets an episode on John Oliver if you mess up.
I think that this would probably be fully funged by other donors as we have a very small number of grants that aren’t publicly reported and a relatively small proportion of donors provide the majority of our funding.
That said GWWC now manages the donations side of funds and I can request they add this feature if I see more demand for it (it will create some operational overhead on our side).
I don’t think that argument makes sense. The more money one donates, the more funds will likely be distributed to causes of all types, including private ones. While you can’t be certain your specific donation caused an increase in funding to private causes, you can be relatively confident that on average such donations will do so.
This is just like how you can’t guarantee buying meat will kill more animals, but you can be relatively confident it will do so on average.
I’m afraid I find this approach troubling. “Don’t you trust me?” is the question of the conman. It’s not necessary, or desirable, to publish your full reasoning, but it is necessary to properly account for funds which you are ultimately holding on trust. Apart from anything else, the reputational damage if someone were to misappropriate donors’ money would be significant.
I think there’s an expectations mismatch here. I think the latest public payout report might be helpful for expectations setting.
Just FYI, I personally don’t donate to LTFF, so while I have some general concern that charitable funds should be spent in an accountable way and that GWWC donors should feel comfortable with how their money is being spent, my personal concern is with GHDF. If the issue of public reports being optional applies only or mostly to LTFF and EAIF, perhaps that could be clarified?
Luke Freeman did recently email me offering to chat, so I guess I could ask him. I think I can personally solve the problem by redirecting my donations to GiveWell, but I can’t be the only person who’s troubled by this.
I don’t really know much about GHDF, sorry. As far as I can tell, it doesn’t seem very active and is probably not meaningfully different from donating to GiveWell (the fund managers are literally all GiveWell staff!)
Well, I think in the past the managers might have used it to fund things they thought were good but that didn’t fit the main GiveWell recommendations. Now that GiveWell has the “All Funds” option, though, I’m not sure what would differentiate it from GHDF; it may be just a presentational difference. I’m 85% sure that the GHDF grants are exactly those displayed in the GiveWell spreadsheet as having been made via GHDF, but I’m just slightly worried by the notice that payout reports are optional.
I totally agree this is really scary. I know you are getting bulk-downvoted, but I think the average person concerned about the future of the planet would be worried that such a large organization is effectively siphoning money into mystery projects while claiming to be a charity.
I didn’t downvote, but I was rather hurt by the implication that my work was con artistry. I spent some time trying to defend myself against the accusation, but realized pretty quickly that it was in practice basically impossible without pretty sophisticated audits and also a bunch of information that I in no way had access to.
I don’t for a moment think that you are a con artist. I suspect that (a) the amounts involved are all small, (b) you genuinely believe that all the grants are effective, (c) mostly you’re right but (d) occasionally, in ordinary human frailty, your judgment errs. If that’s right, then no real harm is done, but I have no way of verifying any of that, because you don’t (as far as I can see) disclose any information at all about the unreported grants.
I have to say also that you sound like you’re saying that you refuse to comply with the law, but as far as I can see, Effective Ventures does in fact comply with the law and publishes a list of its grantees (save for individuals and those receiving grants of less than £25k) in its Trustees’ Report. But that seems to create a different problem: the post above ought to make clear that organisations seeking grants in excess of £25k will not be able to remain anonymous, because the grant will be disclosed in the annual accounts (although I believe there is a “serious prejudice” exception).
Thanks for the reply!
Thanks, but maybe you’re being too generous here! :) I don’t think you should have 0% probability on this; I just think you should basically be at around base rates.
Yep this sounds right. That’s why in the parent thread, I was suggesting other more legible places to donate to, for people who care a lot about this.
I’m not sure what you’re referring to. I apologize for any miscommunication. I don’t really handle operational details and I could be wrong about a bunch of stuff.
I think we typically refer those grants to private funders. This may entail more uncertainty and delays and I apologize as a result.
(all views my own, not my employers’)
I don’t know how this aspect of the law works, but does the Trustees’ Report actually contain all the grants? Based on the May 2021 LTFF report, I would expect to see, e.g., a significant grant made to the Cambridge Computer Science Department (or similar), but unless I am misreading, or it is labelled counter-intuitively, I don’t see it.
More importantly, I would expect almost all of the secretive grants to be made to individuals, which sounds like they are excluded from the reporting anyway.
Doesn’t it disturb you that the available information is insufficient to prove that the organization is spending its money in legitimate ways?
No, I have many more important things to worry about, including but not limited to making good grants.
I assume but have not verified that Effective Ventures and our large funders have access to good auditors.
At the risk of stating the obvious: to concretely demonstrate that money is not being “effectively siphoned”, you would need to de-anonymize not just the grantees but (much more importantly) all the donors. This is not something most charities do publicly, and AFAICT it is typically handled the normal way: through having good accountants, auditors, a legal system, etc.
We are already much more public than the vast majority of institutions (for-profit or non-profit). I don’t think “every person who works part-time in a foundation needs to be able to trace exactly where every dollar comes from or goes every time some person on the internet asks for this” is a reasonable bar for “lower than baseline probability of being a con artist.”
Taking a step back, I think if this is something that you’re very concerned about, it’d be interesting to plan out how to investigate EA charities for fraud. I’m not sure how valuable this work will be, but at least it’s plausibly the type of thing that has a reasonably high EV. I assume a good first step is to talk to a representative sample of really good auditors.
Put another way, the thing that matters to me is that we actually do good in the world. This is where the bulk of moral responsibility lies. As I’ve said before (on a different topic in the grantmaking context):
I’m sure you really care a lot about this, and I’m sure a bunch of random people online implying you might be part of something shady is upsetting to you. I have no doubt you are doing your best to help the world, which is incredible.
But do you see how after things like FTX people might be hesitant to donate to funds that don’t disclose where the money goes? I understand the motives to make this decision were probably good, but there has to be a better way.