Are we probably in the middle of humanity? (Five anthropics thought experiments)
Edit: in the time since writing this post I realized people way smarter than me have already thought about this stuff (which I should have expected), see here, here and here. You should read those instead of this!
Original post follows.
Around 120 billion people have ever lived. Given this number, should we be extremely skeptical of claims that 10^50 people might live in the future? What are the odds that we'd just happen to live among the first 1/10^38th of people who will ever exist? Similarly, should we be skeptical of claims that humanity is ending in the next few years, which would suggest we're among the last 5% of all humans?
If you don't really buy this sort of reasoning to begin with, here's an example of its usefulness. During World War II, the Allies wanted to determine how rapidly Germany was making tanks. After discovering that the serial numbers on German tank parts were sequential (e.g., 1, 2, 3...), some Allied statisticians obtained surprisingly accurate estimates of the speed of German tank production by looking at the numbers on just two captured tanks. (For example, if they'd seen the numbers 1012 and 1041 on the different tanks, they'd have been very surprised to learn the Germans were pumping out 500 tanks a day.)
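As an aside, here's a minimal sketch of the textbook serial-number estimator (the standard minimum-variance unbiased version, which may not be exactly what the Allied statisticians used), applied to the two serial numbers from the example:

```python
def estimate_total(serials):
    """Classic 'German tank problem' estimator: given serial numbers drawn
    from 1..N without replacement, estimate N as m + m/k - 1, where m is the
    largest serial seen and k is the number of tanks captured."""
    k = len(serials)
    m = max(serials)
    return m + m / k - 1

# The two hypothetical captured tanks from the example above:
print(estimate_total([1012, 1041]))  # -> 1560.5 tanks produced so far (roughly)
```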
Of course, this example doesn't exactly mirror my question: I'm not interested in how fast new people are being brought into existence, but where in the complete distribution of people over time we are most likely to be. Answering this question would also tell us something about how many people we should expect to ever exist, and vice versa. Another way my question differs is this: in contemplating how many total people are likely to exist, we can only rely on one observation (our own consciousness), because this is the only plausibly random sample of humanity available to us. Unlike statisticians inspecting tanks, we cannot assume the people we encounter are an unbiased, random subset of all humans who will ever exist, since our sampling process is confounded by our unfortunate access to only the past and the present. But despite these differences, it still feels like a clever Bayesian might be able to use the knowledge that they're the 120-billionth person as some kind of evidence about the total number of people who will ever live. Right?
Two competing perspectives:
One perspective says that regardless of the number of total people who will ever exist, the odds of being any particular person in that universe are equal. If you generate a random number between 1 and 100, you shouldn't be any more surprised if you get 1 than if you get 50 - each had a 1% chance. The same goes if you generate a number between 1 and 1 trillion: getting 1 shouldn't be any more surprising than getting 4,011,218,693. According to this perspective, we should be totally agnostic about how many people will ever exist, since regardless of the answer, our existence at the present point in humanity's timeline is not surprising.
But a second perspective says that our sole observation of consciousness around the 120-billion-people mark can tell us about the relative likelihood of competing views about humanity's lifespan. Suppose I bring you a bag full of green and red balls and have you draw one. It's red. Then I tell you the bag either had 1 red ball and 4 green balls, or 1 red ball and 499 green balls (uh, it's a magic Mary Poppins bag, so you really couldn't tell how many total balls were in there). Which seems more plausible: that there were 5 total balls and you drew the 1/5 chance red ball, or that there were 500 balls and you drew the 1/500 chance red ball? Similarly, the second anthropic perspective asserts that our existence at T + 120 billion humans is evidence that 240 billion total humans ever living is a lot more likely than 10^50 humans ever living, as in the former scenario our particular observation takes up a larger total proportion of the probability space[1].
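If it helps to see the numbers, here's the bag example as a quick Bayes calculation (assuming a 50/50 prior between the two possible bags):

```python
# Two hypotheses about the bag, equally likely before the draw:
prior_small, prior_big = 0.5, 0.5
p_red_small = 1 / 5      # 1 red among 5 balls
p_red_big = 1 / 500      # 1 red among 500 balls

# Bayes' theorem after seeing a red ball:
evidence = prior_small * p_red_small + prior_big * p_red_big
posterior_small = prior_small * p_red_small / evidence
print(posterior_small)   # ~0.99: the 5-ball bag is about 100 times more likely
```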
In thinking about this question, I came up with 5 thought experiments that I think get progressively closer to accurately modeling the right way to think about it. In these analogies, God creates some number of boxes (aka people who you might have wound up being), and depending on the way you frame her randomization/box-creation process, which of the above perspectives you should take seems to differ.
1. Box Before Universe
God tells you there are ten boxes. When you fall asleep, she'll randomly put you in one of them. Then, she'll flip a coin. If it's tails, she'll make another 999,999,999,990 boxes.
You wake up in box 3.
In this scenario, you're clueless about God's coin toss, obviously. Whether or not she created the additional boxes, the odds of you being in box 3 are the same. If you don't think the universe is deterministic, you might think that this is a good analogy for the universe, as our relative position in humanity's larger lifespan isn't "set" in any sense before we're born. Perhaps the number of people who will ever exist is constantly changing, and we're regularly affecting this number by making decisions. But if you agree with me that determinism is true (or you at least think it's possible to predict things), then you're probably similarly dissatisfied with this first thought experiment. There is and has always been some fact of the matter about how many people will ever exist, and so this analogy falls short.
2. Universe Before Box
God tells you she's going to flip a coin. If it's heads, she'll randomly put you in one of ten boxes numbered 1-10. If it's tails, she'll randomly put you in one of ten trillion boxes numbered 1-10,000,000,000,000. You wake up in box 3.
In this scenario, the odds of waking up in box 3 if God's coin landed heads are 1/10. The odds of waking up in box 3 if God's coin landed tails are 1/10,000,000,000,000. Therefore, by Bayes' theorem, the odds of heads given that you woke up in box 3 are one-trillion-to-one. I think you should be confident the coin came up heads[2].
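Spelled out numerically, with a 50/50 prior on the coin (just a quick sketch of the update):

```python
prior_heads = prior_tails = 0.5
p_box3_heads = 1 / 10                   # ten boxes
p_box3_tails = 1 / 10_000_000_000_000   # ten trillion boxes

evidence = prior_heads * p_box3_heads + prior_tails * p_box3_tails
posterior_heads = prior_heads * p_box3_heads / evidence
print(posterior_heads)                  # ~0.999999999999
print(p_box3_heads / p_box3_tails)      # posterior odds for heads: 1,000,000,000,000 to 1
```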
Similarly, assuming there is a fact-of-the-matter about how many people will exist, perhaps our "box number" can justifiably inform our evaluation of competing claims about humanity's lifespan. But this scenario doesn't really feel complete: there are clearly more possibilities than only 10 or 10 trillion people ever living - what happens when we try to consider all the numbers? The following three thought experiments try to answer this.
3. Smiteful God 1.0
God tells you there are ten boxes. She'll randomly put you in one of them, but you won't be able to see the number. She will smite box 1 on day 1, smite box 2 on day 2, and so on. On day 3, after God smites the third box, you find yourself still alive and wonder how many days you have left to live.
At this point, you could be in box 4, 5, 6, 7, 8, 9, or 10, meaning you'll live four more days on average. But the fact that three boxes have been smitten so far tells you nothing about which box you're in (besides that you obviously weren't in those ones).
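For completeness, the arithmetic behind "four more days on average":

```python
# Boxes 4 through 10 are equally likely; being in box k means k - 3 days left.
days_left = [k - 3 for k in range(4, 11)]
print(sum(days_left) / len(days_left))  # -> 4.0
```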
I included this scenario to acknowledge that knowing you aren't in other boxes doesn't work as evidence about which remaining box you are in, even in this problematic analogy where you do somehow know the total number of boxes that exist. But as the next two experiments will demonstrate, knowing you weren't in those boxes is still useful evidence if you modify this experiment to incorporate the broader question of how many boxes (or people) we should expect a universe to contain in the first place. And once we answer this question, determining our relative position within that universe is much easier.
4. Smiteful God 2.0
God tells you she's going to flip a coin. If it's heads, she'll randomly put you in one of ten boxes numbered 1-10. If it's tails, she'll randomly put you in one of ten trillion boxes numbered 1-10,000,000,000,000. In either case, you won't be able to see the number. She will smite box 1 on day 1, smite box 2 on day 2, and so on. On day 9, after God smites the ninth box, you find yourself still alive and wonder how many days you have left to live.
The odds of surviving day 9 if God's coin landed heads are 1/10. The odds of surviving day 9 if God's coin landed tails are 9,999,999,999,991/10,000,000,000,000. Therefore, by Bayes' theorem, the odds the coin landed tails given you survived day 9 are nearly ten-to-one. You should be quite confident the coin came up tails[3].
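The same Bayes bookkeeping as in the second experiment, again with a 50/50 prior on the coin:

```python
prior_heads = prior_tails = 0.5
p_survive_heads = 1 / 10                                         # only box 10 is left out of 10
p_survive_tails = (10_000_000_000_000 - 9) / 10_000_000_000_000  # all but boxes 1-9 survive

evidence = prior_heads * p_survive_heads + prior_tails * p_survive_tails
posterior_tails = prior_tails * p_survive_tails / evidence
print(posterior_tails)  # ~0.909: surviving day 9 favors tails by roughly ten to one
```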
As in the second thought experiment, when you use what you observe - here, simply the fact that you're still alive - to evaluate which of two possible universes you are in, it becomes much clearer that the observation is relevant information. Finally: what happens if you extend this thinking to compare not two, but infinitely many possible hypotheses about how many boxes/people might exist?
5. Smiteful God 3.0
God tells you she's going to pick some number N. Then she'll create N boxes and put you in a random one of them. She will smite box 1 on day 1, box 2 on day 2, and blah blah blah you get the point. Three days pass and your existential dread is unbearable.
View one: You have no clue when you're gonna die. All you know is that it's 3 days less than whatever it was 3 days ago. On average, you will live (N - 2) / 2 more days, and without knowing N you can't know shit. Even if God chose 4 and it was improbable that you would make it here, there's no way of knowing that, and you have no reason to suspect she chose to make 10 trillion boxes over 4 boxes or any other number. Just like in Smiteful God 1.0, the fact that three boxes have been smitten so far gives you no new information about which box you're in.
View two: Suppose that instead of telling you she was going to choose any number, God said she was going to choose some power of 10 up to 10 trillion (could be 10, 100, 1,000, etc.). Then, you would have started with some complicated prior expectation about how many days you would live, but each time a box gets smitten and you aren't in it, the odds she chose 10 decrease and the odds she chose something above your prior expectation increase, as in experiment 4. Thus, with each successive smite, your expectation about how much longer you have to live should increase.
This same logic would apply if God told you she was going to choose any individual number up to 10 trillion, or 10^100, or something way bigger. On view one, because you don't know N, updating it in any way feels unreasonable. But on view two, for any number God chose as the maximum value she could have generated, the rational response would be to update your expectation of N's value slightly upwards each time you wake up. In fact, God doesn't even have to tell you what the maximum number of boxes she might have chosen is: it need only be the case that there exists such a maximum number.
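To make view two concrete, here's a small sketch using the "powers of 10 up to 10 trillion" version of the setup and a uniform prior over those 13 hypotheses (the particular grid is just for illustration):

```python
hypotheses = [10 ** k for k in range(1, 14)]         # N = 10, 100, ..., 10 trillion
prior = {N: 1 / len(hypotheses) for N in hypotheses}

def posterior_after_surviving(days):
    """P(N | still alive after `days` smites). The likelihood of surviving is
    (N - days) / N: the chance your box number is above `days` if there are N boxes."""
    unnorm = {N: prior[N] * max(N - days, 0) / N for N in hypotheses}
    total = sum(unnorm.values())
    return {N: u / total for N, u in unnorm.items()}

for days in (1, 3, 9):
    post = posterior_after_surviving(days)
    print(days, round(post[10], 4), round(post[10 ** 13], 4))
# Each survived smite shifts probability away from N = 10 and toward the larger
# hypotheses, so your expected remaining lifespan creeps upward rather than shrinking.
```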
And so: the objection that we cannot update N since we're completely clueless about it is actually an infinity-breaking-logic-type objection in disguise. As long as we accept that there is some fact-of-the-matter about how long human history could possibly be (e.g., it won't be longer than the length of time from the big bang until the heat death of the universe), our prior expectation should be that we are near neither the very beginning nor the very end of human existence. We should also increase our expectation of humanity's total lifespan as time goes on.
This prior belief should of course be influenced by other evidence, and I'm not sure how resilient/malleable it should be - possibly this sort of consideration is easily outweighed by our observations about technological progress, space travel, etc. But in any case, we should be (at least marginally) more receptive to the idea that humanity will end soon (which would mean we're in the last 5% of people and is not so unreasonable on priors) than to the idea that there will be 10^50 more humans (which would mean we're in the first 1/10^38th, or about 10^-36 percent, of people[4]).
The universe has already flipped the metaphorical coins to determine the total number of people who will exist. I hope I got lucky and I'm merely one of the first human beings who will ever live - but I somewhat doubt that's the case.
Summary: I think it's valid for our prior expectation to be that we're in the middle of humanity. As long as we believe there is a true, finite answer to the question, "how many people will ever exist," we should theoretically be able to lay out all possible numbers of people who could exist and view them as competing hypotheses. Assuming we're completely uncertain, we can assign each number an equal chance to start. Low numbers (120 billion being the minimum) look implausible simply because there are a ton of higher numbers the answer could be. Super high numbers also look implausible because the odds of drawing 120 billion from a bag with numbers 1 through N become increasingly slim as N approaches infinity. The answer which minimizes both kinds of surprise is N = 2X (where X is the number of people who have lived so far)[5], suggesting that we should assume we're living in the middle of humanity by default.
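One way to put rough numbers on this (this is essentially Gott's "Copernican" formulation of the argument rather than anything derived here, and it uses a scale-invariant 1/N prior instead of a strictly uniform one, so treat it as illustrative):

```python
import numpy as np

X = 120e9                                    # your birth rank: ~120 billionth person
N = np.logspace(np.log10(X), 15, 500_000)    # candidate totals, from X up to 10^15

prior = 1 / N            # scale-invariant prior over scales (one common choice)
likelihood = 1 / N       # P(your rank is exactly X | N people ever exist)
weights = prior * likelihood * np.gradient(N)   # account for uneven grid spacing
weights /= weights.sum()

median_N = N[np.searchsorted(np.cumsum(weights), 0.5)]
print(f"{median_N:.2e}")  # ~2.4e+11, i.e. roughly twice the people who have lived so far
```

With a flat prior over N up to some cutoff instead, the posterior median lands nearer the geometric mean of 120 billion and the cutoff, so how close you get to exactly 2X depends on the prior you pick.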
[1] By this logic, why is it not the case that 120 billion is more plausible than 240 billion, and we should assume we're right at the end? 1/120b is more likely than 1/240b, right? The reason is that one also has to consider, "what are the odds that 120 billion people would have existed and I wouldn't have been one of them, given that the total number of people who will ever live in the universe is N?" Given this consideration, 120 billion and 1 seems a lot less plausible than 240 billion. I think N = 2 times the number of humans who have lived before you optimally balances both of these considerations.
[2] A note on the math here: I assume you assign a lower overall probability to p(box 3|tails) than to p(box 3|heads). This seems obviously correct to me. However, you might just reject all of this anthropic reasoning stuff and assign an equal probability to every possible scenario out of ignorance/humility or something. In this case, if you wake up in box 3, you don't gain any information about God's coin flip, and your credence the coin landed heads should still be 50%. Note, though, that this perspective also has the counterintuitive implication that before God put you in any box, it was more likely the coin would come up tails than heads, since there were more possible outcomes that could follow from the coin landing tails (if you're confused, try assigning a probability to each possible outcome in this scenario and you'll see what I mean; see also criticisms of the halfer position in the Sleeping Beauty problem). Alternatively, you could just stubbornly insist on not needing to be consistent with how you assign probabilities to things. But I don't find that satisfying either (and I think I could decimate anyone with this belief in a casino). So in the remaining thought experiments I continue to assume the way I applied Bayes' theorem here is valid.
[3] I think some people might have an objection to experiment 4 that sounds like, "but if you had been in one of boxes 1-9, you wouldn't be around anymore, and so you wouldn't be making this judgment!" This kind of argument is exceptionally hard to think about, but while I think it might be relevant to the question "which box are you in?" in this case, I'm quite confident it isn't relevant to the question "how many boxes are in your universe?" I can't see how this consideration would affect the evidential value of knowing that, alas, we were not in boxes 1-9. Sorry for including the word alas, that was obnoxious.
[4] Tbf, I think most people who say 10^50 people might live in the future are counting digital people, who account for a majority of the people-in-expectation. And in this case, they'd probably endorse the view that we're most likely not in the first 1/10^38th of people; we're actually probably in one of these future simulations (from the perspective of the early non-digital guys). I think this is consistent, just not what most people are assuming when they hear nerds say that 10^50 people will live in the future.
[5] I want to insert math here later.
Thanks for this write-up!
You might already be aware of these, but I think there are some strong objections to the doomsday argument that you didn't touch on in your post.
One is the Adam & Eve paradox, which seems to follow from the same logic as the Doomsday argument, but also seems completely absurd.
Another is reference class dependence. You say it is reasonable for me to conclude I am in the middle of "humanity", but what is humanity? Why should I consider myself a sample from all Homo sapiens, and not, say, apes, or mammals? Or Earth-originating life? What even is a "human"?
Executive summary: Our existence as roughly the 120 billionth human should inform our prior expectations about how many total humans will ever exist, making us skeptical of claims that humanity will end soon or that vastly more humans will exist in the future.
Key points:
The post presents five thought experiments exploring how our position in humanity's timeline could provide evidence about the total number of humans who will ever exist.
In the "Universe Before Box" experiment, waking up in box 3 out of 10 or 10 trillion total boxes provides strong evidence that there are only 10 total boxes.
In the "Smiteful God 2.0" experiment, surviving to day 9 when boxes are destroyed daily provides evidence that there are likely 10 trillion total boxes rather than only 10.
The "Smiteful God 3.0" experiment suggests that even without knowing the maximum possible number of boxes/humans, each day of survival should slightly increase our expectation of the total.
Assuming complete uncertainty, the hypothesis that minimizes surprise about our position as the 120 billionth human is that there will be roughly 240 billion humans total.
This anthropic reasoning is not conclusive but should inform our priors about humanityâs future, making us more skeptical of extinction soon or a vast future population.
This comment was auto-generated by the EA Forum Team.