I especially appreciated this quote, which helped me verbalize one of my own intuitions:
It would be naive to say something like this: "Even if the probability of extinction is one in a billion and the probability of preventing it is one in a trillion, we should still prioritize x-risk reduction over everything else…" The reason it would be naive is that tiny probabilities of vast utilities pose a serious unsolved problem, and there are various potential solutions that would undermine any project with probabilities so low.
On another thread, I laid out my own approach to thinking about Pascal's Mugging: essentially, I start to penalize or even ignore probabilities of averting extinction once they become so unlikely that humanity can't "buy" enough of them to have a reasonable chance of survival (definitions of "reasonable" may vary).
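To make that heuristic concrete, here's a minimal sketch (my own addition, not something from the original thread); the global budget, per-dollar probability, and threshold for a "reasonable" chance are all hypothetical placeholders:

```python
import math

# Minimal sketch of the "can humanity buy enough probability?" heuristic.
# Every number below is a hypothetical placeholder, not an estimate from
# the discussion: a notional global budget and a notional threshold for
# what counts as a "reasonable" chance of averting extinction.

def total_purchasable_probability(p_per_dollar: float, budget_dollars: float) -> float:
    """Upper bound on the aversion probability humanity could accumulate if
    every dollar independently bought probability p_per_dollar.
    Computes 1 - (1 - p)^n stably for very small p."""
    return -math.expm1(budget_dollars * math.log1p(-p_per_dollar))

def worth_doing_the_math(p_per_dollar: float,
                         budget_dollars: float = 1e12,
                         threshold: float = 0.01) -> bool:
    """Penalize/ignore interventions whose success probabilities are so small
    that even the entire budget couldn't buy a 'reasonable' chance."""
    return total_purchasable_probability(p_per_dollar, budget_dollars) >= threshold

print(worth_doing_the_math(1e-12))  # True: ~63% total chance is purchasable in principle
print(worth_doing_the_math(1e-18))  # False: at most ~1e-6 is purchasable
```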
Another way to think about this: I don't think we're very good at determining the difference between "one in a trillion" and "one in a hundred trillion" when those kinds of determinations require speculative models (rather than well-known physical laws). And the smaller the odds, the worse our predictions become. If a mugger tells me I've got a 1/10^10^10 chance of saving his universe, I can't think of any way to verify whether that number is accurate; it might as well be 1/10^10^10^10.
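As a toy illustration of how far out in log space those numbers live (again my own addition, not the original comment's): both offers collapse to the same thing under any ordinary arithmetic, so the only difference is which speculative exponent got written down.

```python
# Toy illustration (hypothetical numbers): both of the mugger's probabilities
# underflow to exactly 0.0 in 64-bit floating point, so any expected-value
# calculation done with ordinary numbers treats them as identical; only the
# speculative exponent on paper distinguishes them.

one_in_10_10_10 = 10.0 ** -(10 ** 10)   # 1 / 10^10^10
print(one_in_10_10_10)                  # 0.0 (underflow)

# 1 / 10^10^10^10 is even less tractable: its exponent, 10^10^10, has about
# ten billion decimal digits, so the probability can't even be written out in
# full, let alone estimated to within a factor that would change the decision.
```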
This is something like "reject open-mindedness", but it's not so much a rejection as an avoidance, fueled by confusion about very small numbers and doubt that humans will be able to distinguish probabilities like that prior to the advent of an AGI powerful enough to make our philosophy seem meaningless. (Fortunately for me, I think that my best giving opportunities are likely enough to prevent extinction that I don't need to do mugging math.)
It's plausible that I've just repeated something from the original Less Wrong discussion about this, and I'd love to hear better ideas from anyone on how to stop avoiding open-mindedness.