… but people are aware of the problem and tackle it (research into the probabilities of various existential risks, looking for particularly neglected existential risks such as AI risk). I haven’t seen anything similar for meta organizations.
I’m closest to the EA Foundation and know that their strategy rests in large part on focusing on hard-to-quantify, high-risk–high-return projects, because these are likely to be neglected. I don’t know whether other metaorganizations are doing something similar, but it is possible.
Imagine that Alice will now have an additional $2,000 of impact, and each of the five organizations spent $1,000 to accomplish this. Then each organization would (correctly) claim a leverage ratio of 2:1, but the aggregate outcome is that we spent $5,000 to get $2,000 of benefit, which is clearly suboptimal. These numbers are completely made up for pedagogical purposes and not meant to be actual estimates. In reality, even in this scenario I suspect that the ratio would be better than 1:1, though it would be smaller than the ratio each organization would compute for itself.
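The double counting can be made concrete in a few lines of Python. The figures are the made-up ones from the example above, and the five-organization count is inferred from the $5,000 total; nothing here is a real estimate:

```python
# Illustration of the attribution problem: every organization claims
# full credit for the same $2,000 of impact, so per-org leverage
# ratios look great while the aggregate ratio is poor.

n_orgs = 5            # number of organizations (inferred from the $5,000 total)
spend_per_org = 1_000  # dollars each organization spent on Alice
total_impact = 2_000   # Alice's additional impact, in dollars

# Each organization, crediting itself with the full impact:
per_org_ratio = total_impact / spend_per_org      # 2.0, i.e. 2:1

# The aggregate picture, counting everyone's spending once:
total_spend = n_orgs * spend_per_org              # $5,000
aggregate_ratio = total_impact / total_spend      # 0.4, i.e. well below 1:1

print(f"per-org: {per_org_ratio}:1, aggregate: {aggregate_ratio}:1")
```

The gap between 2:1 and 0.4:1 is exactly the double counting: five organizations each attributing the same $2,000 to themselves.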
Yes. Good point, and another reason fund ratios are silly (and possibly toxic). The other one is this one. I’ve written an article on a dangerous phenomenon, also related to this attribution problem, that has been limiting the work in some cause areas.
Huh, interesting. I don’t know much about the EA Foundation, but my impression is that this is not the case for other meta orgs.
Yeah, I forgot about evaluating from a growth perspective, despite reading and appreciating that article before. Whoops.