The problem with neglecting small probabilities is the same problem you get when neglecting small anything.
What benefit does a microlitre of water bring you if you’re extremely thirsty? Something so small it’s equivalent to zero? Well, if I offer you a microlitre of water a million times and you say ‘no thanks’ each time, you’ve missed out on a whole litre! The rational way to value things is for a million microlitres to be worth the same as one litre. The 1000th microlitre doesn’t have to be worth the same as the 2000th, but the values have to add up to the value of one litre — and if they’re all zero, they can’t.
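A toy version of the argument, with made-up numbers (the value of a litre and the declining weighting are both illustrative assumptions): marginal values don’t need to be equal, but they do need to sum to the value of a full litre, which a valuation of zero-per-microlitre can never do.

```python
n = 1_000_000                 # microlitres in a litre
litre_value = 100.0           # hypothetical value of a litre to a thirsty person

# Valuing every microlitre at zero makes a million of them worth zero:
assert sum(0.0 for _ in range(n)) == 0.0

# A declining marginal valuation: each successive microlitre is worth less
# than the one before, yet the million values still add up to one litre.
weights = [n - i for i in range(n)]            # 1_000_000, 999_999, ..., 1
total_weight = sum(weights)                    # n * (n + 1) / 2
marginal_values = [litre_value * w / total_weight for w in weights]

assert abs(sum(marginal_values) - litre_value) < 1e-6  # sums to the litre's value
```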
I think the same logic applies to valuing small probabilities. For instance, what is the value of one vote from the point of view of a political party? The chance of it swinging an election is tiny, but they’ll quickly go wrong if they assign all votes zero value.
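A back-of-the-envelope sketch of the vote example, where both numbers are assumptions chosen purely for illustration (the decisiveness probability and the value the party places on winning):

```python
p_decisive = 1e-7                    # assumed chance one vote swings the election
value_of_winning = 1_000_000_000.0   # assumed value the party puts on winning

ev_per_vote = p_decisive * value_of_winning
# Roughly 100 units per vote: a tiny probability, but far from zero value --
# a party that valued every vote at zero would happily forgo millions of them.
```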
I’m not sure what the solution to Pascal’s mugging/fanaticism is. It’s really troubling. But maybe it’s something like penalising large effects with our priors? We don’t ignore small probabilities; we instead become extremely sceptical of large impacts (in proportion to the size of the claimed impact).
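The penalty idea above can be sketched under an assumed (not established) rule that credence shrinks in proportion to the claimed impact, i.e. p(impact) ∝ 1/impact; the function and its numbers are hypothetical:

```python
def credence(claimed_impact, base_impact=1.0, base_credence=0.01):
    """Hypothetical prior: scepticism grows with the size of the claim."""
    return base_credence * base_impact / claimed_impact

# Expected value stays flat (~0.01) however big the mugger's claim gets,
# so enormous claimed payoffs no longer dominate the calculation.
expected_values = {x: credence(x) * x for x in (10.0, 1e6, 1e12)}
```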