The “what causes should CEA represent?” issue seems especially tricky because the current canonical EA cause areas have very different metrics underpinning them.
Global development & animal welfare usually use GiveWell-style cost-effectiveness analysis to determine what’s effective.
X-risk usually uses theoretical argument & back-of-the-envelope estimates to determine effectiveness.
I’m not sure what movement building uses – probably theory and back-of-the-envelope as well?
Anyway, the point is that there's no meta-metric the current cause areas use to compare against each other.
So when considering a new cause area, should we use the x-risk standard of effectiveness? Or the global development one? (rhetorical)
Seems tricky – I’m glad CEA is thinking about this.
I really like the Open Philanthropy Project’s way of thinking about this problem:
https://www.openphilanthropy.org/blog/update-cause-prioritization-open-philanthropy
The short version (in my understanding):
Split assumptions about the world/target metrics into distinct “buckets”.
Do allocation as a two-step process: intra-bucket allocation on that bucket's own metric, and inter-bucket allocation separately, using other sorts of heuristics.
(If you like watching videos rather than reading blog posts, Holden also discussed this approach in his fireside chat at EAG 2018: San Francisco.)
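To make the two-step structure concrete, here's a minimal sketch in Python. All the bucket names, metrics, and weights are made up for illustration; this isn't how Open Phil actually computes anything, just the shape of the procedure as I understand it.

```python
# Hypothetical two-step allocation sketch (all names/numbers invented).
# Step 1 (intra-bucket): rank interventions within each bucket on that
#   bucket's own metric.
# Step 2 (inter-bucket): split the total budget across buckets using a
#   separate heuristic (here, fixed worldview weights).

buckets = {
    "global_health": {
        "weight": 0.5,  # heuristic inter-bucket share, not derived from any metric
        "interventions": {"bednets": 9.0, "cash_transfers": 6.5},  # e.g. cost-effectiveness scores
    },
    "x_risk": {
        "weight": 0.5,
        "interventions": {"ai_safety": 8.0, "biosecurity": 7.0},  # e.g. rough expected-impact estimates
    },
}

def allocate(total_budget, buckets):
    allocation = {}
    for name, bucket in buckets.items():
        # Intra-bucket: pick the top intervention on this bucket's metric.
        best = max(bucket["interventions"], key=bucket["interventions"].get)
        # Inter-bucket: the budget share comes from the heuristic weight;
        # the two buckets' metrics are never compared directly.
        allocation[best] = total_budget * bucket["weight"]
    return allocation

print(allocate(1_000_000, buckets))
# Funds "bednets" and "ai_safety" at $500k each under these made-up weights.
```

Note that nothing in this sketch says where the buckets or their weights come from, which is exactly the gap discussed below: the framework separates the two allocation steps but doesn't itself decide which buckets exist.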
Sure, but I don’t think that framework gives a decision procedure for what buckets are worth considering. (Haven’t read it closely recently, so maybe I missed this.)
For example, I’m pretty sure a Christian who’s interested in EA principles wouldn’t be able to convince EA decision-makers that a Christian missionary intervention was effective, even if it was very cost-effective & had a track record of success.
The Christian wouldn’t be able to make the case for their missionary intervention because “spreading the word of God” isn’t a goal that EA considers worthwhile. As far as I know, EA doesn’t have a strong case for why this kind of thing isn’t worthwhile; it’s just one of the “deep judgment calls” that Holden talks about in that post.
Not caring about Christian missionary work is in the cultural DNA of EA. It’s not a particularly justified position; rather, it’s an artifact of the worldview assumptions that a quorum of EAs brought to the community at a certain point in time.
(To be super-duper clear, I’m not advocating for Christian interventions to be included in EA; it’s just an illustrative example.)