1.
I have thought about this, and I’m actually biting the bullet. I think that a lot of people share in the impact of a lot of things, and that even smallish projects depend on a lot of other moving parts, in the direction of “You didn’t build that.”
I don’t agree with some of your examples when taken literally, but I agree with the nuanced point you’re gesturing at with them: e.g., building good roads seems very valuable precisely because it helps other projects, and if there is high nurse absenteeism, then the nurses who do show up take some of the impact...
I think that if you divide every project’s impact by the same factor, say 10, the ordering of projects by impact stays the same, so this shouldn’t dissuade people from doing high-impact things. The interesting part is that in practice some divisors will be greater than others, and thus the ordering can change; I claim that this change is informative (see the sketch below).
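Here’s a minimal numerical sketch of that point, with made-up projects, impact figures, and divisors (none of these numbers are from the thread): a uniform divisor preserves the ranking, while project-specific divisors can reshuffle it.

```python
# Minimal sketch with made-up numbers (projects A, B, C are hypothetical):
# dividing every project's impact by the same factor preserves the ordering,
# while project-specific divisors can change it.
impacts = {"A": 100, "B": 60, "C": 30}  # illustrative counterfactual impacts

uniform = {p: v / 10 for p, v in impacts.items()}  # same divisor for every project
per_project = {
    "A": impacts["A"] / 20,  # A depends on many other contributors
    "B": impacts["B"] / 2,   # B is mostly one person's work
    "C": impacts["C"] / 3,
}

def rank(d):
    return sorted(d, key=d.get, reverse=True)

print(rank(impacts))      # ['A', 'B', 'C']
print(rank(uniform))      # ['A', 'B', 'C'] -- same ordering
print(rank(per_project))  # ['B', 'C', 'A'] -- ordering changes
```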
2.
Not really. If 10 people have already done it, your Shapley value will be positive if you take that bargain. If the thing hasn’t been done yet, you can’t convince 10 Shapley-optimizing altruists to do it for 0.5m each, but you might convince 10 counterfactual-impact optimizers (see the sketch below). As @casebach mentioned, this may run into problems under uncertainty (for example: what if you’re pretty sure that someone else is going to do it?).
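To make the Shapley-vs-counterfactual distinction concrete, here is a minimal sketch. It is not the example from the original post: the 1m project value, the 10-person team, and the function names are all assumptions for illustration. In a toy game where the project succeeds only if all 10 people contribute, each person’s counterfactual impact is the full 1m, but their Shapley value is 1m / 10 = 100k.

```python
# Toy sketch (illustrative numbers only): Shapley values for a game where a
# project worth V is completed only if all n participants contribute.
# Removing any one person makes the project fail, so each person's
# counterfactual impact is V, but their Shapley value is only V / n.
from itertools import combinations
from math import factorial

def shapley(n, v):
    """Shapley value of each of n players under value function v (takes a set)."""
    players = range(n)
    values = []
    for i in players:
        others = [p for p in players if p != i]
        total = 0.0
        for r in range(len(others) + 1):
            for S in combinations(others, r):
                weight = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                total += weight * (v(set(S) | {i}) - v(set(S)))
        values.append(total)
    return values

V, n = 1_000_000, 10                            # hypothetical project value and team size
v = lambda coalition: V if len(coalition) == n else 0
print(shapley(n, v))                            # each player gets V / n = 100,000
```

Under these made-up numbers, no Shapley-optimizing altruist accepts a 0.5m-per-person cost for 100k of Shapley value, whereas counterfactual-impact optimizers, each crediting themselves with the full 1m, might.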
3.
You’re right. The example, however, specified that the EAs were to be “otherwise idle”, to simplify calculations.