Thanks a lot for the explanation! It does make more sense in the context of the text, though to be clear this is extremely far from the original meaning of the phrase, and the phrase also has very negative connotations in our community. So I’d prefer it if future community members don’t use “Pascal’s mugging” to mean “a rhetorical move that introduces huge moral stakes into the world-view in order to push people into drastically altering their actions and priorities,” unless maybe it’s locally scoped and clearly defined in the text to mean something other than the original technical meaning.
It is unfortunate that I can’t think of a better term for this concept off the top of my head; I’d be interested in good suggestions.
“a rhetorical move that introduces huge moral stakes into the world-view in order to push people into drastically altering their actions and priorities”
What is the definition you’d prefer people to stick to? Something like “being pushed into actions that have a very low probability of producing value, because the reward would be extremely high in the unlikely event they did work out”?
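For concreteness, that proposed definition has a simple expected-value shape, which a toy calculation can illustrate (the numbers below are entirely made up, not from any source):

```python
# Toy expected-value comparison illustrating the "mugging" shape:
# a near-zero probability of success attached to an astronomically large payoff.
p_mundane, v_mundane = 0.5, 100     # ordinary action: 50% chance of modest value
p_mugging, v_mugging = 1e-12, 1e20  # mugging: tiny chance of enormous value

ev_mundane = p_mundane * v_mundane  # 50.0
ev_mugging = p_mugging * v_mugging  # roughly 1e8, dominating despite the absurd odds

print(ev_mundane, ev_mugging, ev_mugging > ev_mundane)
```

The point of the example is that a naive expected-value maximizer ends up steered by the huge claimed payoff no matter how implausible the claim is, which is what makes the original thought experiment a puzzle rather than an argument.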
The Drowning Child argument doesn’t seem like an example of Pascal’s Mugging, but Wikipedia gives the example of:
“give me five dollars, or I’ll use my magic powers from outside the Matrix to run a Turing machine that simulates and kills 3↑↑↑↑3 people”
and I think recent posts like The AI Messiah are gesturing at something like that (see, even, this video from the comments on that post: Is AI Safety a Pascal’s Mugging?).
“being pushed into actions that have a very low probability of producing value, because the reward would be extremely high in the unlikely event they did work out”
Yes, this is the definition I would prefer.
I haven’t watched the video, but I assume it’s going to say “AI Safety is not a Pascal’s Mugging, because the probability of AI x-risk is nontrivially high.” So someone who comes into the video assuming that AI risk is a clear Pascal’s Mugging, because they view it as “a rhetorical move that introduces huge moral stakes into the world-view in order to push people into drastically altering their actions and priorities,” would be pretty unhappy with the video and think there was a bait-and-switch.