Looking at the EA Community Fund as an especially tractable example (due to the limited field of charities it could fund):
Since its launch in early 2017 it appears to have collected $289,968, and not to have regranted any of it until an $83k grant to EA Sweden, currently in progress. I am basing this on https://app.effectivealtruism.org/funds/ea-community, so it may not be precisely right.
On the one hand, it’s good that some money is being disbursed. On the other hand, the only info we have is https://app.effectivealtruism.org/funds/ea-community/payouts/1EjFHdfk3GmIeIaqquWgQI. All we’re told about the idea and why it was funded is that it’s an “EA community building organization in Sweden” and that Will MacAskill recommended Nick Beckstead fund it “on the basis of (i) Markus’s track record in EA community building at Cambridge and in Sweden and (ii) a conversation he had with Markus.” Putting it piquantly (and over-strongly, I’m sure, for effect), this sounds concerningly like an old boys’ network: Markus > Will > Nick. (For those who don’t know, Will and Nick were both involved in creating CEA.) It might not be, but the paucity of information doesn’t let us reassure ourselves that it’s not.
With $200k still unallocated, one would hope that the larger and more reputable EA movement building projects out there would have been funded, or that we could at least see they’ve been diligently considered. I may be leaving some out, but these would at least include the non-CEA movement building charities: EA Foundation (for their EA outreach projects), Rethink Charity and EA London. As best I can tell from the answer I got from Rethink Charity at http://effective-altruism.com/ea/1ld/announcing_rethink_priorities/dir?context=3, this is not true in their case at least.
Meanwhile, these charities can’t make their case directly to the movement building donors whose money has gone to the fund since its creation.
This is concerning, and sounds like it may have done harm.
Note: EA is totally a trust network—I don’t think the funds are trying to be anything like GiveWell, whom you’re supposed to trust based on the publicly verifiable rigour of their research. EA Funds is much further toward the “have you personally seen CEA make good decisions in this area” or “do you specifically trust one of the re-granters” end of the spectrum. Which is fine; trust is how tightly-knit teams and communities often get made. But if you gave to it thinking “this will be like giving to Oxfam, with the same accountability structure”, then you’ll be surprised, and rightly so, to find out that it works significantly via personal connections.
The same way you’d only fund a startup if you knew the founders and how they worked, you should probably only fund EA Funds for similar reasons—and if a startup tried to write its business plan so that anyone would have reason to fund it, the business plan probably wouldn’t be very good. I think EA should continue to be a trust-based network, so on the margin I’d rather people give less to EA Funds than have EA Funds make grants that are more defensible.
On this very website, clicking the link “New to Effective Altruism?” and browsing a little quickly leads to recommendations to give to EA Funds. If EA Funds really is intended to be a high-trust option, CEA should change that recommendation.
Yup. I suppose I wrote down my assessment of the information available about the funds and the sort of things that would cause me to donate to them, not the marketing used to advertise them—which does indeed feel disconnected. There seems to be a confusing attempt to make the funds seem reasonable to everyone while not in fact offering the sort of evidence that would make them so.
The evidence for it is not the ‘evidence-backed charities’ that made GiveWell famous and trustworthy, but “here is a high-status person in a related field who has a strong connection to EA”. That seems not so different from the way other communities ask their members for funding: it’s based on trust in the community’s leaders, not on metrics objectively verifiable by outsiders. So you should ask yourself what causes you to trust CEA and rely on that, rather than on the objective metrics associated with EA Funds (of which there are far fewer than with GiveWell). For example, if CEA has generally made good philosophical progress in this area and also made good hiring decisions, that would make you trust the grant managers more.
I don’t think the funds are trying to be anything like GiveWell
I disagree – the funds are definitely positioned as more risky than e.g. GiveWell, but nothing like as risky as you’re making out. Take the Global Health and Development Fund:
Why might you choose not to donate to this fund?
Donors with a very low risk tolerance may choose to avoid this fund because the fund is empowered to take risks at the organizational level by funding unproven, but promising new charities.
Reading that, I would come away thinking that grants from that fund ought to be rigorous in a (mostly) publicly verifiable way (more than just a conversation someone had with someone).
This strikes me as a false dichotomy between ‘trust the grantmaking because lots of information about its decisions is made public’ and ‘trust the grantmaking because you personally know the re-granter (or know someone who knows someone, etc.)’. I would expect this is instead supposed to work the way a lot of for-profit funds presumably work: you trust your money to a particular fund manager because their funds have a strong history of making money. You don’t need to know Elie personally (or know how he works or makes decisions) to know his track record of setting up GiveWell and thereby finding excellent giving opportunities.
For information. EA London has neither been funded by the EA Community Fund nor diligently considered for funding by the EA Community Fund.
In December, EA London was told that the EA Community Fund was not directly funding local groups, as CEA would be doing that. (This seems to be happening; see: http://effective-altruism.com/ea/1l3/announcing_effective_altruism_community_building/)
Just noticed this, so sorry for the late reply! I (through EA Sweden) was the recipient of an EA Community Fund grant a few months back. I’ll say a few things about the grant and some general thoughts about the EA Community Fund, and am happy to answer any questions you might have!
I’d say there was a bit more information to go on than simply Will having seen me do EA group organizing over the past five or so years: I also provided a project proposal. However, I’d agree that the main factor in making the grant was trust and first-hand knowledge of my past work.
If you wanna know more about what we’ve been up to, you can read our plans for the year (as of February) here: http://effective-altruism.com/ea/1kf/effective_altruism_sweden_plans_for_2018/
Abstracting from my particular situation, there currently seem to be growing pains in the EA community-building space. My impression is that the bottleneck is not good projects to fund, but rather the capacity to consider proposals and allocate funds. I think making funding decisions in the community-building space based largely on trust and a proven track record is a good heuristic. However, it won’t be particularly scalable, and so needs to be supplemented by more time-intensive methods.
Given the small size of the EA Community Fund, it seems unreasonable for Nick Beckstead to be managing it. Once CEA is able to allocate their EA Community Building Grants effectively, I’d recommend that the EA Community Fund be allocated by CEA rather than by Nick Beckstead.