The Pascal’s Mugging objection has been discussed a lot around here. There isn’t an equivalence between all causes and muggings, because the probabilities and outcomes are distinct and still matter. Not every religion, cause, and technology has the same tiny probability of the same large consequences, and you cannot satisfy every one of them, because they carry major opportunity costs. If you apply EV reasoning to cases like this, you just end up with a strong focus on one or a few of the highest-impact issues (like AGI) at heavy short-term cost. Unusual, but not a reductio ad absurdum.
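To make that concrete, here is a minimal sketch in Python, with numbers that are purely made up for illustration (they are not anyone’s actual estimates), of why EV reasoning ranks such claims rather than paying all of them:

```python
# Purely illustrative, made-up numbers: the point is only that EV reasoning
# ranks small-probability / large-payoff claims rather than paying all of them,
# because the probabilities, payoffs, and opportunity costs all differ.
causes = {
    "mugger's offer":   {"p": 1e-20, "value": 1e15, "cost": 10},
    "speculative tech": {"p": 1e-9,  "value": 1e12, "cost": 1e6},
    "AGI safety":       {"p": 1e-3,  "value": 1e10, "cost": 1e6},
}

def expected_value(c):
    # Expected benefit of funding the cause, net of its opportunity cost.
    return c["p"] * c["value"] - c["cost"]

ranked = sorted(causes, key=lambda name: expected_value(causes[name]), reverse=True)
for name in ranked:
    print(f"{name}: net EV = {expected_value(causes[name]):.3g}")
# A fixed budget then goes to the top of this ranking, not to every claimant
# equally: concentrated, but not the "pay every mugger" absurdity.
```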
There is no philosophical or formal system that properly describes human beliefs, because human beliefs are messy, fuzzy neurophysiological phenomena. But we can choose a rational system for modeling our beliefs more consistently, and if we do, we may as well pick one that doesn’t give obviously wrong implications in Dutch book cases, because a belief system with wrong implications does not fit our picture of ‘rational’ (whether or not we ever encounter those cases).
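For anyone who hasn’t seen a Dutch book spelled out, here is a minimal sketch (a toy agent whose credences in A and in not-A are both 0.6, violating the probability axioms) of the kind of guaranteed-loss implication I mean:

```python
# A minimal Dutch book sketch, assuming a toy agent whose credences in a
# proposition A and in not-A are both 0.6 (they sum to more than 1).
credence_A, credence_not_A = 0.6, 0.6
stake = 1.0  # each bet pays out $1 if it wins

# The agent regards a price of credence * stake as fair for each bet,
# so they will happily buy both.
price_paid = credence_A * stake + credence_not_A * stake  # $1.20

# Whatever happens, exactly one of the two bets pays off.
payout = stake  # $1.00 in every possible world

print(f"paid {price_paid:.2f}, received {payout:.2f}, "
      f"guaranteed loss {price_paid - payout:.2f}")
# paid 1.20, received 1.00, guaranteed loss 0.20
```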
Thanks for the comment!
I agree that the probabilities matter, but then it becomes a question of how they are assessed and weighed against each other. On that basis, I don’t think it has been established that AGI safety research has a stronger claim to higher overall EV than other potential ‘mugging’ causes.
Regarding the Dutch book issue, I don’t really agree with the argument that we ‘may as well go with’ EV because it avoids these cases. Many people would argue that the limitations of the EV approach, such as having to assign a precise probability to every belief and being unable to suspend judgement, also do not fit our picture of ‘rational’. It’s not obvious why hypothetical better behaviour in Dutch book cases should outweigh these considerations. I am not pretending to resolve this debate; I am just trying to raise it as relevant for assessing high-impact, low-probability events. EV is potentially problematic in such cases, and we need to talk about that seriously.
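To make the worry concrete, here is a minimal sketch with hypothetical numbers (not estimates of anything real) of how EV becomes indeterminate once credences are only given as an interval rather than a single precise value:

```python
# Hypothetical numbers only: suppose the honest state of belief about some risk
# is an interval of credences rather than a single precise probability.
low_p, high_p = 1e-9, 1e-3   # assumed credence interval for the event
value, cost = 1e10, 1e6      # assumed payoff if the work succeeds, and its cost

ev_low = low_p * value - cost    # net EV at the bottom of the interval
ev_high = high_p * value - cost  # net EV at the top of the interval

print(f"net EV ranges from {ev_low:.3g} to {ev_high:.3g}")
# The range runs from roughly -1e6 to roughly +9e6: the sign of the verdict
# depends entirely on which precise number inside the interval we are forced
# to pick, which is exactly where one might prefer to suspend judgement.
```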