Note that a world where insect suffering has a 50% chance of being 10,000x as important as human suffering, and a 50% chance of being 0.0001x as important as human suffering, is also a world where you can say exactly the same thing with humans and insects reversed.
That should make it clear that the ‘in expectation, [insects are] 5000x more important’ claim that follows is false, or more precisely requires additional assumptions.
This is the type of argument I was trying to eliminate when I wrote this:
https://forum.effectivealtruism.org/posts/atdmkTAnoPMfmHJsX/multiplier-arguments-are-often-flawed
I think this is a good point about precise phrasing, but I think the argument still basically goes through that insects should be treated as extremely important in expectation. You can eliminate the two-envelope problem either by making the numbers fixed/concrete, or by using conditional probabilities.
Namely:
“50% to be 10,000x as important as human suffering | insect suffering matters” = a 50% chance there are huge stakes in the world, far more than we thought.
“50% to be 0.0001x as important as human suffering | insect suffering doesn’t matter at all” = a 50% chance the stakes are much smaller, in line with what we thought.
This makes it clear that the first world should be prioritized.
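To make the “fix the numbers” move concrete, here is a minimal Python sketch using the illustrative figures from this discussion (50%, 10,000x, 0.0001x). The choice to hold human suffering fixed at 1 unit, and all variable names, are assumptions for illustration only; the point is just that the naive expected-ratio framing produces the paradox, while a fixed-unit expectation is well-defined and is dominated by the high-stakes hypothesis.

```python
# Sketch of the two-envelope issue and one way to dissolve it, using the
# made-up numbers from the discussion above (50% / 10,000x / 0.0001x).
# Normalizing human suffering to 1 unit is an assumption for illustration,
# not a claim about actual moral weights.

p_matters = 0.5        # probability that insect suffering matters a lot
high_ratio = 10_000    # insect : human importance if it matters
low_ratio = 0.0001     # insect : human importance if it doesn't

# Naive "expected ratio" framing: it looks paradoxical, because it says
# each side is ~5000x as important as the other.
e_insect_over_human = p_matters * high_ratio + (1 - p_matters) * low_ratio
e_human_over_insect = p_matters * (1 / high_ratio) + (1 - p_matters) * (1 / low_ratio)
print(e_insect_over_human)   # ~5000
print(e_human_over_insect)   # ~5000 -- both can't be "5000x more important"

# Fixed-unit framing: hold the value of human suffering constant at 1 unit,
# and let the value of insect suffering vary across the two hypotheses.
human_value = 1.0
e_insect_value = (p_matters * high_ratio * human_value
                  + (1 - p_matters) * low_ratio * human_value)
print(e_insect_value)        # ~5000 units, vs. 1 unit for humans

# With the unit fixed, the expectation is well-defined, and almost all of the
# expected value sits in the hypothesis where insect suffering matters --
# the "huge stakes" world described above.
```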
More intuitively: suppose you thought there was a 50% chance you could prevent a holocaust-level event (10,000,000 lives) from happening to humans, but a 50% chance that this intervention would be completely useless. Alternatively, you could do a normal intervention to save 1,000 lives.
You could say “the normal intervention has a 50% chance of being ~infinitely more valuable than the holocaust-prevention thing.”
But it’s obvious you should do the holocaust-prevention thing, because here it’s clearer what the comparative/conditional stakes are: in one possible world, the ‘world you can affect’ is vastly larger, and that world should be prioritized.
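For concreteness, here is a quick expected-lives calculation with the toy numbers from this example (illustrative figures only, not real estimates):

```python
# Expected-value comparison for the toy example above
# (numbers are the ones used in the example, not real estimates).

p_success = 0.5
lives_if_success = 10_000_000   # holocaust-level event averted
lives_normal = 1_000            # the "normal" intervention, assumed certain

e_holocaust_prevention = p_success * lives_if_success   # 5,000,000 expected lives
e_normal = lives_normal                                  # 1,000 expected lives

print(e_holocaust_prevention, e_normal)

# The multiplier framing ("the normal intervention has a 50% chance of being
# ~infinitely more valuable") is true of the ratio in one branch, but the
# expected-value comparison still overwhelmingly favors the high-stakes option.
```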
Caveats: this ignores longtermist arguments, and in reality the probability that insects matter is << 50%.