Beware of Small Numbers
Abstract
Expected-value calculations have led many EAs to worry that they might provide a moral justification for fanatics—people working on a problem with an extremely low chance of success but an extremely high expected value. This situation was famously imagined in Pascal’s Mugging. An easy solution to Pascal’s Mugging is to adopt the De Minimis principle, which holds that certain events are simply too rare to be considered. While EAs often reject this principle as clumsy, groundless, and arbitrary, I believe that the De Minimis principle has a very straightforward application to beliefs (“credences”), as opposed to traditional probabilities, which refer to events. Specifically, I hypothesize that the “minimis” in question is around 5%.
What is Probability?
Before proceeding, it is useful to examine a simple probabilistic game. Imagine that you and I flip a coin, and agree that I will “win” if the coin lands heads. All of the elements of the game—bet, odds, and payoff—correspond to some feature of the real world. When I claim to have a 50% chance of success, I mean that the coin has two sides, and that one of them will secure my victory, giving me a fraction of success equal to ½.
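As a quick illustration of this “feature of the world” reading, here is a sketch of my own in Python: simulate enough flips and the fraction of ½ emerges on its own.

```python
import random

def empirical_heads_rate(n_flips: int = 100_000) -> float:
    """Flip a fair two-sided coin n_flips times and return the observed
    fraction of heads, which converges on the real-world ratio of 1/2."""
    heads = sum(random.random() < 0.5 for _ in range(n_flips))
    return heads / n_flips

print(empirical_heads_rate())  # ~0.5, because the coin really has two sides
```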
The same thing holds for rarer, more complex events, since extensive statistical records allow us to make informed guesses about features of the real world, some of them exceedingly subtle. Roughly speaking, the sheer quantity of large asteroids whizzing around the universe gives us a numerator, while the vastness of space gives us a denominator, with the ratio representing the odds of a direct hit. Just as we directly examine a coin to observe how many sides it has, we may indirectly estimate the frequency of asteroid impacts by studying the geological record.
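The indirect estimate works the same way as the direct one, just with a longer chain of inference. Here is a minimal sketch, with invented illustrative figures (the real geological numbers are derived far more carefully):

```python
# Estimate the annual odds of a large asteroid impact from a (hypothetical)
# geological record: impacts found are the numerator, years surveyed are
# the denominator. Both figures below are invented for illustration.
impacts_in_record = 4            # assumed: large impacts visible in the record
years_in_record = 100_000_000    # assumed: span of the record, in years

annual_impact_probability = impacts_in_record / years_in_record
print(f"~{annual_impact_probability:.0e} chance per year")  # ~4e-08
```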
What are Credences?
However, there is a second type of probability, which I will call a probability of belief. It does not refer to a feature of the world; rather, it refers to a person’s own mind, which, unlike the real world, is often characterized by a certain fuzziness and imprecision. As an experiment, try to imagine how much you would be willing to pay for a cup of coffee right now. $2.50? How about $2.51? Or $2.5000001? The attempt to determine an exact price is exhausting, and probably Sisyphean.
This understanding is reflected in our casual unwillingness, when referring to our own opinions, to stray from a few common probabilities that have special places in our hearts—such as 50%, 75%, 80%, 99%, and 100%.
Perhaps this signifies our laziness when interrogating our own beliefs. More likely, I think, is that we feel called to those probabilities because they accurately represent our feelings. 50% means, “I don’t know if it will happen or not.” 75% means, “I’m pretty sure this will happen.” 80% means, “I’d be surprised if this didn’t happen.” 99% means, “This will definitely happen, but I’ll retain a little humility.” 100% means, “Don’t be an idiot.” After these probabilities are passed around enough, they become well-worn and acquire a familiar meaning.
57.5%, however, has no such history and no such meaning. This is more than semantics—I suspect that it does not even have meaning to the person who uses that probability to describe their own state of mind. Nobody has yet invented words capable of describing a “57.5% credence” that could not also describe “57.4% credence”, which probably indicates that not enough people are capable of distinguishing those two feelings to begin with.
Where’s the Line?
It is interesting to notice that, until 2025, the U.S. Mint produced nothing smaller than the penny—even though the penny had cost more than its face value to mint and distribute for years. Presumably, the Treasury was willing to subsidize a certain level of specificity in the name of consumer convenience, but no more. Now that the penny has been discontinued, the nickel (which also costs more than its face value to produce) may be taken as a reasonable estimate of the specificity with which modern-day Americans are able to quantify their desires in the marketplace.
This phenomenon is also reflected in the tendency of flea markets and high-end restaurants to charge only in multiples of $1, on the premise that it is somewhat petty to force consumers to decide precisely what they are willing to pay for a given good.
Personally, I find it impossibly difficult to internally distinguish between credences separated by less than 5 percentage points, with 0%, 5%, 10%, and so on serving as important mile-markers. I can accept that certain people, like gamblers or insurance executives, are more familiar with what certain probabilities of belief mean, or, if you prefer, what states of mind they correspond to. However, I suspect that they too have an inviolable “floor”—the odds of a given card appearing at the top of a freshly shuffled deck, for instance, are around 2% (1/52 ≈ 1.9%).
For the rest of us, when determining what level of specificity of belief we may reasonably claim to hold, it is instructive to think about probabilities that we regularly anticipate. Here’s a brief chart of common events attached to their probabilities:
100% - Americans who are Americans
10% - Americans who are left-handed
1% - Americans who are twins
.1% - Americans who are polydactylic
.01% - Odds that you will die in a mass shooting
.001% - Odds that you’ll get hit by lightning sometime in your life
.0001% - Odds of a fatal car crash in good weather
.00001% - Odds that a given plane will crash
.000001% - Odds of being born with a water allergy
I know many Americans, several left-handed people, one set of twins, and no one who is polydactylic (as far as I know). When someone tells me that they believe something has a 10% chance of happening, I sometimes imagine meeting someone for the first time and noticing that they are left-handed. This shared bit of knowledge allows me to access what my partner means (hopefully, he also has a similar point of reference for his number, and isn’t simply making it up).
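To make that translation concrete, here is a toy lookup of my own (the chart above hard-coded, with percentages converted to probabilities) that maps a stated probability onto the nearest familiar event:

```python
import math

# The chart above, with percentages converted to probabilities.
REFERENCE_EVENTS = [
    (1.0,        "being American (among Americans)"),
    (0.10,       "being left-handed"),
    (0.01,       "being a twin"),
    (0.001,      "being polydactylic"),
    (0.0001,     "dying in a mass shooting"),
    (0.00001,    "being hit by lightning"),
    (0.000001,   "a fatal car crash in good weather"),
    (0.0000001,  "a given plane crashing"),
    (0.00000001, "being born with a water allergy"),
]

def nearest_reference(p: float) -> str:
    """Return the reference event closest to p on a log scale,
    since orders of magnitude are what matter here. Requires p > 0."""
    return min(REFERENCE_EVENTS,
               key=lambda pair: abs(math.log10(pair[0]) - math.log10(p)))[1]

print(nearest_reference(0.1))      # being left-handed
print(nearest_reference(0.00001))  # being hit by lightning
```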
Rationally, I am aware that the events below 10% are possible, but I do not take them especially seriously—emotionally, they are all clustered around 1%, in the category of things that could happen, but never do. Because I have never experienced them, I have no numerator in my ratio of probability. So without a functional fraction, I cannot associate their supposed probability with any sense of anticipation, excitement, dread, etc.—beyond the usual vague fears of flying and hopes of winning the lottery, which have nothing to do with the statistics, and would not meaningfully change if their odds shifted by an order of magnitude up or down.
As an example, it’s interesting to imagine what would happen if the TSA announced: “The odds of your plane crashing are now .00001%—just the same as the odds that your name starts with the letter U!” Presumably, the second half of that announcement would be more meaningful than the first. Everyone who knew someone named “Ursula” would surely rush for the exits. But what if the TSA announced: “The odds of your plane crashing are now .001%—just the same as the odds that you’ll get hit by lightning!” Then there might only be some anxious glances and shrugs.
You might be tempted to write this off as statistical illiteracy. In the case of a plane crash, which is a real-world event, it certainly would be. But when we are using statistics as analogies for our own mental states, it is vitally important that we use statistics only insofar as they really correspond to mental states, both our audience’s and our own. The airport example implies that a .001% chance “feels” like zero, just like the odds of getting hit by lightning. Having a name that starts with the letter U, however, is suddenly within the realm of the possible, and so develops a meaningful charge.
To insist on using probabilities that are meaningless, both to you and your audience, would be like claiming that you were feeling “apple” or thinking “zebra,” and that it was impossible to elaborate—in response to which your friends and family could reasonably accuse you of being deliberately annoying.
Implications for EA
In a critical article about EA in the London Review of Books, the philosopher Amia Srinivasan bashed longtermism thus:
“The expected value of reducing an x-risk by one billionth of one billionth of a percentage point (that’s 0.0000000000000000001 per cent) is still a hundred billion times greater than the value of saving the lives of a billion people living now.”
It is difficult to know how seriously to take this objection, which is often put forward by critics of EA. On one hand, EA has always ruffled feathers with its unwavering consistency—some critics seem to feel that donating abroad is a step too far. On the other hand, EAs themselves are often unable to fully commit to risk-neutrality. While it is often held up as the “correct” philosophical move, even hardcore EAs often squirm when asked to commit their careers to a field like wild animal welfare, which is beset by risk from all directions.
If we take Srinivasan’s problem as given, and the probabilities as referring to features of the universe, then I think the logic holds: it should be worth ignoring a fairly large number of people today for a small chance of helping a tremendous number of people in the far future. (With a million caveats—I don’t, for instance, believe in the simple equation of “pain” with “pleasure,” or that the distant future will necessarily be overwhelmingly good for people, animals, aliens, or anyone else.)
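For what it’s worth, the arithmetic checks out. Here is the comparison as a sketch; the future-population figure is my own back-solved assumption (roughly what the quoted numbers imply), not anything stated in the article:

```python
# The expected-value comparison behind the quoted objection.
future_lives = 1e41        # assumed future population, back-solved from the quote
risk_reduction = 1e-21     # one billionth of one billionth of a percentage
                           # point, expressed as a probability
lives_saved_now = 1e9      # a billion people alive today

ev_xrisk = risk_reduction * future_lives   # 1e20 lives in expectation
print(ev_xrisk / lives_saved_now)          # 1e11: a hundred billion times greater
```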
However, if the longtermist has arrived at that minuscule number himself, à la Pascal’s Mugging, then I’m not sure his calculation should hold any water at all. Does the longtermist claim that his personal contribution will reduce the chance of an x-risk by “one billionth of one billionth of a percentage point”? How did he arrive at that number? How can he possibly wrap his head around a probability that small?
More to the point, how can he be sure that there aren’t a couple dozen more zeroes on there—a result which should lead him to drop everything and work on Global Health and Development? Such a specific number leads me to mistrust his (or really Srinivasan’s) motives; I worry that the number has been contrived in order to spit out a certain result.
And of course, it has been. But it’s not an unfair parody, because we contrive such numbers, too. In pretending to be confused by fanaticism, or by thought experiments like Pascal’s Mugging, we are refusing to acknowledge the fact that most of us, confronted with the mugger, would have exactly a 0% credence in his claims. Personally, if I were approached by the Mugger, I would be choosing between 0% and 5%—and I think we can all agree that the first option would be more appropriate. The human mind simply cannot entertain a probability small enough to make paying the mugger seem reasonable.
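As a sketch of the De Minimis rule I am proposing (the rounding helper and the 5% grid are my own illustration, not an established method):

```python
def round_credence(p: float, grid: float = 0.05) -> float:
    """Snap a stated credence to the nearest humanly distinguishable step,
    here a 5-percentage-point grid, per the proposal above."""
    return round(p / grid) * grid

mugger_payoff = 1e30       # whatever astronomical sum the mugger promises
stated_credence = 1e-9     # the "rational" non-zero credence
felt_credence = round_credence(stated_credence)  # snaps to 0.0

print(stated_credence * mugger_payoff)  # 1e+21: pay the mugger
print(felt_credence * mugger_payoff)    # 0.0: keep your wallet
```

On that grid, no promised payoff can rescue the mugger, because the multiplicand is exactly zero.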
If you doubt this, then go out and “mug” the most rational of your EA friends. In the face of real-world penalties, they will drop the pretense and admit that they do not know how to believe there is a .000000001% chance you are telling the truth. They simply do not believe you at all.
Conclusion
EAs should have no truck with small credences—not even to make a point. Many animal advocates I know believe that there is a serious chance their efforts will be critical to ending factory farming; the same is true of many AI researchers and pandemic specialists in preventing global catastrophe. Personally, I am motivated by something like a 10-30% chance that my efforts will be critical to saving hundreds of thousands of animals. This may be too rosy, or even arrogant, but it accurately represents my internal belief, which is all I meant to communicate in the first place. If you claim to believe your efforts have only a very tiny chance of saving lives (below, say, .1%), then I would encourage you to consider whether you really mean “0%,” and are using this incomprehensibly small probability as a crutch. Alternatively, I would ask you to compare your current credence with a vastly smaller credence—one that would necessitate changing your career—and ask how you “know” that you have the first credence rather than the second.