St. Petersburg Demon – a thought experiment that makes me doubt Longtermism

Epistemic status: Just a thought that I have, nothing too rigorous

The reason Longtermism is so enticing (to me, at least) is that the existence of so many future lives hangs in the balance right now. Bringing 10^52 people (or whatever the real number turns out to be) into existence just seems like a pretty good deed to me.

This hinges on the belief that utility scales linearly with the number of QALYs, so that twice as many happy people are also twice as morally valuable. My belief in this was recently shaken by the following thought experiment:

***

You are a traveling EA on a trip to St. Petersburg. In a dark alley, you meet a demon with the ability to create universes and a serious gambling addiction. He says he was about to create a universe with 10 happy people, but he hands you three fair dice and offers you a bet: you throw the three dice, and if they all come up 6, he refrains from creating a universe. If you roll anything else, he doubles the number of people in the universe he will create.

You do the expected value calculation (sketched after the story) and figure out that, by throwing the dice, you create 696.8 QALYs in expectation. You take the bet and congratulate yourself on your ethical decision.

After the good deed is done and the demon has committed to creating 20 happy people, he offers you the same bet again: roll the three dice, he creates no universe on 6-6-6 and doubles the population on anything else. He tells you he will keep offering you this bet. You do your calculations and throw the dice again and again until, eventually, you roll all sixes. The demon vanishes in a cloud of sulfurous mist, without having to create any universe, and leaves you wondering whether you should have done anything differently.

***
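For concreteness, here is one way to run the single-bet numbers. This is my own sketch: the figure of 70 QALYs per happy person is an assumption I'm making to get numbers in the same ballpark as the 696.8 in the story, and the exact result depends on how that per-person value is chosen.

```python
from fractions import Fraction

# Assumption (mine, not the post's): each happy person is worth ~70 QALYs.
QALYS_PER_PERSON = 70

p_all_sixes = Fraction(1, 6) ** 3   # 1/216: the demon creates nothing
p_doubled = 1 - p_all_sixes         # 215/216: the promised population is doubled

people_if_declined = 10
expected_people_if_taken = p_doubled * (2 * people_if_declined)   # ~19.9 people

ev_decline = people_if_declined * QALYS_PER_PERSON        # 700 QALYs for sure
ev_take = expected_people_if_taken * QALYS_PER_PERSON     # ~1393.5 QALYs in expectation

print(float(ev_take - ev_decline))  # ~693.5 expected QALYs gained by rolling
```

However the per-person value is set, each individual bet looks clearly positive in expectation, which is what drives the story.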

There are a few ways to weasel out of the demon's bets. You could say that the strategy "always take the demon's bet" has an expected value of 0 QALYs, and so you should go with some tactic like "take the first 20 bets, then call it a day". But I think that if you refuse a bet, you should be able to reject that particular bet on its own terms, without taking into account which bets you have taken in the past or will take in the future.
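To see why "always take the bet" ends up with an expected value of 0: three sixes come up eventually with probability 1, at which point the demon creates nobody. A quick simulation sketch (mine, not from the original post):

```python
import random

def always_take_the_bet(start_people: int = 10) -> int:
    """Keep accepting the demon's bet forever; return the final population."""
    people = start_people
    while True:
        if [random.randint(1, 6) for _ in range(3)] == [6, 6, 6]:
            return 0        # the demon vanishes without creating anyone
        people *= 2         # any other roll doubles the promised population

# Every run ends with 0 people, even though each individual bet looked good.
print([always_take_the_bet() for _ in range(5)])   # -> [0, 0, 0, 0, 0]
```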

I think the only consistent way to refuse the demon's bets at some point is to have a bounded utility function. You might think it would be enough to have a utility function that scales sublinearly with the number of QALYs, say logarithmically. But in that case the demon can offer to double the amount of utility instead of doubling the number of QALYs, and we are back in the paradox. At some point you have to be able to say: "There is no possible universe that is twice as good as the one you have already promised me." So at some point, adding more happy people to the universe must have a negligible ethical effect. And once we accept that this must happen at some point, how confident are we that 10^52 people are that much better than 8 billion?
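To make the bounded-utility point concrete, here is a toy sketch. The functional form and the numbers are entirely my own choice, not anything from the argument above: with a utility function that saturates at some maximum, you accept the demon's first few bets but refuse once the promised universe is already close to the bound.

```python
import math

U_MAX = 100.0    # utility can never exceed this bound (arbitrary choice)
SCALE = 1_000    # population at which utility reaches ~63% of the bound

def utility(people: float) -> float:
    """A toy bounded utility function: increasing, but saturating at U_MAX."""
    return U_MAX * (1 - math.exp(-people / SCALE))

def should_take_bet(people: int) -> bool:
    """Take the bet iff its expected utility beats the sure thing."""
    p_all_sixes = (1 / 6) ** 3
    ev_roll = (1 - p_all_sixes) * utility(2 * people) + p_all_sixes * utility(0)
    return ev_roll > utility(people)

n = 10
while should_take_bet(n):
    n *= 2
print(f"With this utility function you refuse once {n} people are promised.")
```

The specific cutoff depends entirely on where the bound sits, which is exactly the uncomfortable question: if the bound bites somewhere, it might already bite between 8 billion and 10^52.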

Overall, I am still pretty confused about this subject and would love to hear more arguments and perspectives.