One possibility is to encourage epistemic modesty about which charities are the top ones, relative to cohorts who would make much the same decision as you.
Suppose I am wondering about donating to GWWC, but am not sure if I will just displace another donor. From an outside view, it is not obvious which of us would make the best alternative use of the money (presumably we would both look for valuable giving opportunities). I think people often don’t fully take this into account, and assume that holding the money themselves may be much better. But if you think the value is comparable (even if you think your judgement might on average be a little better), it could well be good to donate as soon as the opportunity arises, in order to increase the efficiency of the entire process.
That’s a good point, and it might be plausible with regards to both charities and causes. Thinking through it a little, if...
donor X has decided that GWWC (or whatever) is the best charity in the world before considering displacement effects
and—we’d presumably want to add—these displacement effects don’t appear to be unusually significant (e.g. displacing money which then goes to finish creating an unfriendly AI, or the Gates Foundation’s small and most-favoured pet projects having a large pool of potential funding available anyway while their marginal projects are clearly eccentric)
...then this view would suggest that donor X just gives to that charity, and lets displaced donors give to what they initially thought was second best. The challenge is that this looks non-consequentialist. A potential counter could explore the good consequences of a diverse donor market in which everyone gives to what they think best, though it might be harder to use this counter within the small EA market as opposed to the general philanthropic one. I haven’t really thought this through, just getting some undigested considerations down (without care or concern for karma) as I come up with them, in the hope that I can free-ride on someone else doing that thinking through ;)