"…despite reasonable claims that each area deserves to be prioritized over the others…"
I am saying that there are sets of reasonable moral views under which each EA focus area is a near-complete priority. As (non-exhaustive) examples: a person-affecting view plus animals having moral weight → prioritize animal suffering reduction, while a person-affecting view plus species-based (human-centric) moral views → global health and welfare. (And even given moral uncertainty, the weights you assign to different moral views in, say, a moral parliament can lead to each outcome. If you weight by current human moral views, you would likely arrive at global health and welfare, whereas if you weight equally by the expressed preferences of living beings, shrimp welfare is probably dominant.)
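To make the moral-parliament point concrete, here is a toy sketch with entirely made-up scores and cause labels (none of these numbers come from any real analysis): each view scores each cause area, and the parliament's verdict is the weight-averaged score, so changing the weights alone can flip the top priority.

```python
# Toy moral parliament: hypothetical views, causes, and scores for illustration only.
views = {
    # view name: {cause: how strongly that view favors the cause, 0..1}
    "person-affecting + humans only":   {"global health": 1.0, "animal welfare": 0.0, "x-risk": 0.1},
    "person-affecting + animals count": {"global health": 0.2, "animal welfare": 1.0, "x-risk": 0.1},
    "totalist":                         {"global health": 0.1, "animal welfare": 0.5, "x-risk": 1.0},
}

def parliament_top_cause(weights):
    """Return the cause with the highest weight-averaged score across views."""
    totals = {"global health": 0.0, "animal welfare": 0.0, "x-risk": 0.0}
    for view, w in weights.items():
        for cause, score in views[view].items():
            totals[cause] += w * score
    return max(totals, key=totals.get)

# Weighting roughly by current human moral views favors global health:
print(parliament_top_cause({"person-affecting + humans only": 0.8,
                            "person-affecting + animals count": 0.15,
                            "totalist": 0.05}))          # "global health"

# Weighting the views equally flips the top cause:
print(parliament_top_cause({"person-affecting + humans only": 1/3,
                            "person-affecting + animals count": 1/3,
                            "totalist": 1/3}))           # "animal welfare"
```

The point is only structural: nothing about the causes changed between the two calls, just the weights on the views.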
Sorry, I’m asking how you’re defining “prioritize”.
I agree with your definition—highest priority according to each group would be about the marginal dollar allocation.
As an aside, I would note that portfolio allocation is a slightly different problem from marginal-dollar allocation: if we have more money than the top priority can usefully absorb, we should invest in more than one thing. And if we are at all (morally) risk averse, or accept any of several approaches to handling moral uncertainty, there are benefits to diversification as well, so even the marginal dollar should be split across more than one priority.
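The risk-aversion point can be shown with a minimal numerical sketch (all probabilities and payoffs are invented for illustration): with a concave utility over total outcomes, an interior split of the budget beats going all-in, even though one option has the strictly higher expected value.

```python
import math

# Two hypothetical interventions; each succeeds with some probability,
# paying off per dollar invested, or fails and pays nothing.
A = (0.5, 3.0)   # (p_success, payoff per dollar); EV = 1.5 per dollar
B = (0.9, 1.5)   # EV = 1.35 per dollar -- lower EV than A

def expected_utility(frac_to_A, budget=1.0):
    """E[log(1 + total payoff)]: a concave (risk-averse) utility,
    averaged over the four independent success/failure outcomes."""
    (pA, vA), (pB, vB) = A, B
    a, b = frac_to_A * budget, (1 - frac_to_A) * budget
    eu = 0.0
    for hitA in (0, 1):
        for hitB in (0, 1):
            p = (pA if hitA else 1 - pA) * (pB if hitB else 1 - pB)
            payoff = hitA * a * vA + hitB * b * vB
            eu += p * math.log(1 + payoff)
    return eu

# Grid-search the split; a linear (risk-neutral) utility would put 100% in A,
# but the log-utility optimum is strictly interior, i.e. diversified.
best_split = max((f / 100 for f in range(101)), key=expected_utility)
```

The same structure appears under several treatments of moral uncertainty: concavity anywhere in the aggregation makes the optimal "next dollar" a split rather than a corner solution.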
Continuing the aside: yes, you might split the marginal dollar because of uncertainty, as in playing a mixed strategy. Alternatively, returns might diminish sharply, so that you go all-in on one intervention until its marginal EV drops below that of the next-best intervention, at which point you switch to funding that one; this also produces diversification.
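The diminishing-returns mechanism can be sketched as a greedy allocator (the two curves below are hypothetical, chosen only so the switch-over is visible): each dollar goes to whichever intervention currently has the highest marginal EV, and the portfolio ends up diversified with no uncertainty involved at all.

```python
def marginal_ev(intervention, dollars_so_far):
    """Made-up diminishing-returns curves: marginal EV of the next dollar."""
    base, decay = {"A": (10.0, 0.5), "B": (6.0, 0.1)}[intervention]
    return base / (1 + decay * dollars_so_far)

budget = 20
spent = {"A": 0, "B": 0}
for _ in range(budget):
    # Greedy rule: fund the intervention with the highest current marginal EV.
    best = max(spent, key=lambda i: marginal_ev(i, spent[i]))
    spent[best] += 1

# "A" starts with the higher marginal EV, but its returns decay faster,
# so after a few dollars the allocator switches to "B" -- both get funded.
```

Note this is exactly the switch-over behavior described above: all-in on one intervention until its marginal EV falls below the runner-up's, then switch.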