A Simpler Version of Pascal’s Mugging
Background: I found Bostrom’s original piece (https://www.nickbostrom.com/papers/pascal.pdf) unnecessarily confusing, and numerous Fellows in the EA VP Intro Fellowship have also been confused by it. I think we can present these ideas more accessibly.
I wrote this in about 30 minutes though, so it’s probably not very good, and I would greatly appreciate feedback on how to improve it. I also can’t decide whether it would be useful to end with a “possible solutions” section, because as far as I can tell these solutions are all subject to complicated philosophical debate that goes over my head, so including them might just add confusion. It might be easiest to provide comments on the Google Doc itself (https://docs.google.com/document/d/1NLfDK7YqPGdYocxBsTX1QMldLNB4B-BvbT7sevPmzMk/edit)
Pascal is going about his day when he is approached by a mugger demanding his wallet. Pascal refuses to hand over his wallet, at which point the mugger offers the following deal: “Give me your wallet now, and tomorrow I will give you twice as much money as is in it.”
Pascal: “I have $100 in my wallet, but I don’t think it’s very likely you’re going to keep your promise.”
Mugger: “What do you think is the probability that I keep my promise and give you the money?”
Pascal: “Hm, maybe 1 in a million, because you might be some elaborate YouTube prankster.”
Mugger: “Ok, then you give me your $100 now, and tomorrow I will give you $200 million.”
Let’s do the math. We can calculate expected value by multiplying the value of an outcome by the probability of that outcome. The expected value of taking the deal, based on Pascal’s stated belief that the mugger will keep their word, is $200,000,000 * 1/1,000,000 = $200. The expected value of not taking the deal, by contrast, is $100 * 1 (certainty) = $100. So if Pascal is an expected-value maximizer, he should take the deal.
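The arithmetic above can be written out as a tiny sketch (using the story’s numbers, and simplifying the way the text does by ignoring the lost $100 in the no-payout case):

```python
# Toy expected-value comparison for Pascal's two options.
p_keep_promise = 1 / 1_000_000   # Pascal's stated probability the mugger pays
offer = 200_000_000              # the mugger's promised payout, in dollars
wallet = 100                     # what Pascal hands over

ev_take = offer * p_keep_promise   # ≈ $200
ev_refuse = wallet * 1.0           # keep $100 with certainty

print(ev_take, ev_refuse)
```

On these numbers the deal has twice the expected value of refusing, which is exactly the lever the mugger is pulling.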
At this point Pascal might realize that the chance of the mugger actually having 200 million dollars is extremely low. But this doesn’t change the conundrum, because the mugger will simply offer more money to compensate for the lower probability of following through. Say Pascal now puts the probability of the mugger following through at one in a trillion. Then the mugger offers 200 trillion dollars.
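The mugger’s escalation strategy can be sketched the same way: whatever probability Pascal assigns, the mugger just scales the offer so that the expected value of taking the deal stays fixed (toy numbers again, assuming the mugger targets the same $200 expected value as before):

```python
wallet = 100  # dollars Pascal would hand over

def counter_offer(p):
    # Offer just enough that p * offer = 2 * wallet,
    # so the expected value of taking the deal is always $200.
    return 2 * wallet / p

for p in [1 / 1_000_000, 1 / 1_000_000_000_000]:
    offer = counter_offer(p)
    print(offer, offer * p)  # expected value stays ≈ 200
```

A one-in-a-million probability gets a $200 million offer; one in a trillion gets $200 trillion, matching the story. No finite probability estimate escapes this, which is the heart of the problem.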
The mugger is capitalizing on the fact that everything we know, we know with a probability less than one. We cannot be 100% certain that the mugger won’t follow through on their promise, even though we intuitively know they won’t. Extremely unlikely outcomes are still possible.
Pascal: “200 trillion dollars is too much money. In fact, I don’t think I would benefit from having any more than 10 million dollars.”
Pascal is drawing a distinction between expected value (measured in money) and expected utility (measured in happiness, satisfaction, or whatever else we find intrinsically valuable), but the mugger is unfazed.
Mugger: “Okay, but you do value happy days of life such that more happy days is always better than fewer happy days. It turns out that I’m a wizard, and I can grant you 200 trillion happy days of life in exchange for your wallet.”
Pascal: “It seems extremely unlikely that you’re a wizard, but I value 200 trillion happy days of life so highly that the expected utility is still positive, and greater than what I get from just keeping my $100.”
Pascal hands his wallet to the mugger but doesn’t feel very good about doing so.
So what’s the moral of this story?
-Expected value is not a perfect system for making decisions, because we all know Pascal is getting duped.
-We should be curious and careful about how to deal with low-probability events that carry extremely high (or low) value, like extinction risks. Relatedly, common sense seems to suggest that spending effort on sufficiently unlikely scenarios is irrational.