Regarding “Pascal’s Mugging”: I am not the author, so I might well be mistaken. But I think I can get closer to the intended meaning than “vaguely shady”.
One paragraph is:
EA may not in fact be a form of Pascal’s Mugging or fanaticism, but if you take certain presentations of longtermism and X-risk seriously, the demands are sufficiently large that it certainly pattern-matches pretty well to these.
which I read as: “Pascal’s mugging” describes a rhetorical move that introduces huge moral stakes into the world-view in order to push people into drastically altering their actions and priorities. I think that this in itself need not be problematic (there can be huge stakes which warrant a change in behaviour), but if there is social pressure involved in forcing people to accept the premise of huge moral stakes, things become problematic.
One example is the “child drowning in a pond” thought experiment. It does introduce large moral stakes (the resources you use for conveniences in everyday life could in fact be used to help people in urgent need; and in the thought experiment itself you would decide that the latter is more important) and can be used to imply significant behavioural changes (putting a large fraction of one’s resources to helping worse-off people). If this argument is presented with strong social pressure not to voice objections, that would be a situation which fits under “Pascal’s mugging” in my understanding.
If people are used to this type of rhetorical move, they will become wary as soon as anything along the lines of “there are huge moral stakes which you are currently ignoring and you should completely change your life-goals” is mentioned to them. Assuming this, I think the worry that
[...] the demands are sufficiently large that it certainly pattern-matches pretty well to these.
makes a lot of sense.
Thanks a lot for the explanation! It does make more sense in the context of the text, though to be clear this is extremely far from the original meaning of the phrase, and the phrase also has very negative connotations in our community. So I’d prefer it if future community members don’t use “Pascal’s mugging” to mean “a rhetorical move that introduces huge moral stakes into the world-view in order to push people into drastically altering their actions and priorities,” unless maybe it’s locally-scoped and clearly defined in the text to mean something that does not have the original technical meaning.
It is unfortunate that I can’t think of a better term off the top of my head for this concept, however; I would be interested in good suggestions.
a rhetorical move that introduces huge moral stakes into the world-view in order to push people into drastically altering their actions and priorities
What is the definition you’d prefer people to stick to? Something like “being pushed into actions that have a very low probability of producing value, because the reward would be extremely high in the unlikely event they did work out”?
The Drowning Child argument doesn’t seem like an example of Pascal’s Mugging, but Wikipedia gives the example of:
“give me five dollars, or I’ll use my magic powers from outside the Matrix to run a Turing machine that simulates and kills 3 ↑↑↑↑ 3 people”
and I think recent posts like The AI Messiah are gesturing at something like that (see, even, this video from the comments on that post: Is AI Safety a Pascal’s Mugging?).
being pushed into actions that have a very low probability of producing value, because the reward would be extremely high in the unlikely event they did work out
Yes this is the definition I would prefer.
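(As an aside, the trap this definition points at is easy to see with a toy expected-value calculation. This is my own illustration, not from the thread, and every number in it is made up; 3 ↑↑↑↑ 3 itself is far too large to represent, so a stand-in of 10**100 lives is used, which is already enough to swamp any sane probability.)

```python
# Toy sketch of why naive expected-value maximisation is vulnerable to
# a Pascal's mugging: a tiny probability times an extreme payoff can
# dominate any mundane option. All numbers are invented for illustration.

from fractions import Fraction

def expected_lives_saved(prob, lives):
    """Naive expected value: probability the claim is true times the payoff."""
    return prob * lives

# Mundane option: a donation that saves one life with probability 1/4000.
mundane = expected_lives_saved(Fraction(1, 4000), 1)

# Mugger's option: a 1-in-10**50 chance the Matrix story is true,
# with 10**100 lives (a stand-in for 3 ↑↑↑↑ 3) at stake.
mugging = expected_lives_saved(Fraction(1, 10**50), 10**100)

# The mugging "wins" by an astronomical margin, which is the paradox:
assert mugging > mundane
```

The point of the exercise is that under this (original, technical) definition the problem is in the decision theory, not in any social pressure: however small you make the mugger’s probability, a sufficiently large claimed payoff keeps the expected value dominant.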
I haven’t watched the video, but I assume it’s going to say “AI Safety is not a Pascal’s Mugging because the probability of AI x-risk is nontrivially high.” So someone who comes into the video with the assumption that AI risk is a clear Pascal’s Mugging, since they view it as “a rhetorical move that introduces huge moral stakes into the world-view in order to push people into drastically altering their actions and priorities,” would be pretty unhappy with the video and think that there was a bait-and-switch.