It’s worth noting that if the uncertainty is regarded as wholly factual (which is unrealistic, of course), then the insect wager does work, since then there’s only a single unit of moral value in play.
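To make that concrete, here’s a minimal numerical sketch (every number and parameter below is an illustrative assumption, not anything established in this discussion): if we fix one common unit of moral value and treat only the insect’s sentience as factually uncertain, both options land on the same scale and the expectations can be compared directly.

```python
# Minimal sketch: purely factual uncertainty, one common unit of moral value.
# All parameter values are illustrative assumptions.

p_insect_sentient = 0.01   # factual credence that insects are sentient at all
insects_helped = 10**6     # insects the intervention would help (hypothetical)
humans_helped = 1          # humans the alternative would help
insect_sentience = 0.001   # insect sentience relative to a human's, if sentient

# Because both quantities are denominated in human-equivalent welfare,
# the expected values are directly comparable.
ev_insects = p_insect_sentient * insects_helped * insect_sentience
ev_humans = humans_helped * 1.0

print(f"EV (insects): {ev_insects:.2f} human-equivalents")  # 10.00
print(f"EV (humans):  {ev_humans:.2f} human-equivalents")   # 1.00
```

With those made-up numbers the wager happens to favor the insects; the point is only that the comparison is well posed once there’s a single unit.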
Under some assumptions that might be correct, but not in the general case: the ratio of a human’s degree of sentience to an insect’s could plausibly be anywhere from 1 to 10^9 or more.
To push the wager through, you need an extra assumption: that none of your probability mass has the human being much more important (or sentient) than the insect. In other words, you need to enforce an upper bound on the ratio of human importance to insect importance. And that assumption isn’t specific to either the moral or the empirical question, unless you argue for it from further philosophical premises (certain types of property dualism, say), over which you ought to apply some uncertainty as well.
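As a rough illustration of why the cap matters (again, the prior and all numbers here are made up for the sketch): under a log-uniform prior over the human:insect sentience ratio, the expectation is dominated by the upper tail, so where you truncate the prior decides the verdict.

```python
import numpy as np

# Sketch: the verdict of the wager tracks the cap on the sentience ratio.
# Prior: log-uniform ratio r on [1, cap]; all parameters are illustrative.

rng = np.random.default_rng(0)
insects_helped = 10**6  # hypothetical number of insects the wager would help

for cap in (10**3, 10**6, 10**9):
    # Sample r with log(r) uniform on [0, log(cap)].
    r = np.exp(rng.uniform(0.0, np.log(cap), size=1_000_000))
    ev_human = r.mean()  # expected value of one human, in insect-units
    verdict = "insects" if insects_helped > ev_human else "human"
    print(f"cap=10^{round(np.log10(cap))}: E[human] ~ {ev_human:.3g} insect-units -> {verdict}")

# The mean is tail-dominated: raising the cap eventually makes the single
# human outweigh any fixed number of insects, which is why the wager needs
# an explicit upper bound on the ratio to go through.
```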
Yeah. My point was that the “two envelopes problem for moral uncertainty” doesn’t apply to factual uncertainty. But certainly the conclusion you reach by applying factual uncertainty depends on your priors.
I agree that the two-envelope problem doesn’t apply to factual uncertainty. The whole thing gets significantly complicated by situations where it’s hard to specify how much of the uncertainty is moral and how much is factual.
I regard getting good, usable answers to this as an important research topic within philosophy. Since questions of pure moral uncertainty are still unresolved, you’d probably need to make assumptions about that half of the problem just to get started.