This is a neat approach, Rob, and some form of it seems likely to be one of the best ways of thinking about this. I think the emphasis on putting yourself in the shoes of those you’re trying to help, rather than acting for yourself, is particularly valuable. There is one extra difficulty you haven’t mentioned, though, which is that people may have preferences different from your own.
Even if I’m able to work out that, given a random chance of being one of the participants, I would prefer 2 to 3, it doesn’t necessarily follow that 2 is preferable to 3 in any objective sense. It is interesting to imagine what the participants themselves would choose behind your veil (if they were fully informed about the trade-offs, etc.).
In many cases, one finds that people tend to think their own condition is less bad than people who don’t have the condition do. (That is, if you ask sighted people how bad it would be to be blind, they say it would be much worse than blind people do when asked.) This suggests that, behind a veil of ignorance where self-interest is not at play, those at risk of malaria but not worms might regard treating worms as most important, while those at risk of worms but not malaria would regard treating malaria as most important. It then seems hard to know whom to prioritise.
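One way to make this difficulty concrete is a toy expected-utility sketch (all numbers, group names, and option labels here are invented for illustration, not taken from the post): if the participants’ preferences differ, the option you would pick for yourself behind the veil need not be the one an average over the participants would pick.

```python
# Toy sketch: expected utility behind a veil of ignorance when the
# participants' preferences differ. All utilities are made-up numbers;
# "option_2" and "option_3" stand in for the interventions in the post.
utilities = {
    "group_A": {"option_2": 7.0, "option_3": 5.0},   # e.g. at risk of malaria
    "group_B": {"option_2": 4.0, "option_3": 6.5},   # e.g. at risk of worms
}

def expected_utility(option):
    """Average utility of `option`, assuming an equal chance of being
    any participant (so each group gets equal weight)."""
    return sum(g[option] for g in utilities.values()) / len(utilities)

# Group A alone would rank option 2 first, but averaging over everyone
# flips the ordering:
print(expected_utility("option_2"))  # 5.5
print(expected_utility("option_3"))  # 5.75
```

The point is only that the ranking is sensitive to whose preferences you feed in, which is exactly the indeterminateness worried about above.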
There’s also the eternal problem with imagining what one would choose: people often choose poorly. I assume you’re making some assumption that the choice happens under the best possible conditions. It may be, though, that your values depend on your decision-making conditions.
Of course, you still have to choose, and as you say it’s clear that 2 and 3 are both preferable to 1. I think this tool will get you answers most of the time, and can focus your mind on the important questions, but there’s an intrinsic uncertainty (or maybe indeterminacy) about the ordering.
I would go for:
1) use their preferences and experiences (pretend you don’t know what you personally want)
2) imagine you knew everything you could about the impacts.
Which I think is considered the standard approach when thinking behind a veil.
As you say, you might find it hard to do 1) properly, but I think that effect is small in the scheme of things. It’s also better than not trying at all!
“This suggests that, behind a veil of ignorance where self-interest is not at play, those at risk of malaria but not worms might regard treating worms as most important, while those at risk of worms but not malaria would regard treating malaria as most important.”
Wouldn’t they then cancel out if you took the average of the two when deciding?
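Whether they cancel depends on the numbers. A minimal sketch of taking the average (the severity ratings are entirely hypothetical):

```python
# Hypothetical severity ratings (0-10). Following the pattern above, each
# group rates its own condition as less bad than the other group does.
ratings = {
    "at_risk_of_malaria": {"malaria": 4.0, "worms": 6.0},
    "at_risk_of_worms":   {"malaria": 6.0, "worms": 4.0},
}

def average_severity(condition):
    # Equal-weight average of the two groups' ratings for one condition.
    vals = [group[condition] for group in ratings.values()]
    return sum(vals) / len(vals)

print(average_severity("malaria"))  # 5.0
print(average_severity("worms"))    # 5.0
# With perfectly symmetric disagreement the averages tie, so the bias
# cancels; but that only holds if the over- and under-estimates are equal
# in size, which the blindness example gives no guarantee of.
```

So averaging helps exactly to the extent that the two groups’ biases are mirror images of each other.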