Barring simulation shutdown sorts of things or divine intervention, I think it's more like 1 in 1 million per century, on the order of magnitude of encounters with alien civilizations. Simulation shutdown is a hole in the argument that we could attain such a state, and I think a good reason not to say things like ‘the future is in expectation 50 orders of magnitude more important than the present.’
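(As a rough sanity check on what that figure implies, under my own simplifying assumption of a constant hazard rate, which is not something claimed above: a 1-in-a-million risk per century corresponds to an expected horizon of about 10^6 centuries.)

```python
# Sketch: expected survival horizon under a constant per-century extinction
# risk r (geometric model -- my simplifying assumption, not a claim from above).
r = 1e-6                      # 1-in-a-million extinction risk per century
expected_centuries = 1 / r    # mean of a geometric distribution: 10**6 centuries
expected_years = 100 * expected_centuries
print(f"{expected_years:.0e} years")  # ~1e+08 years of expected survival
```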
Whether simulation shutdown is a good reason not to say such things would seem to depend on how you model the possibility of simulation shutdown.
One naive model would say that there is a 1/n chance that the argument that such a risk exists is correct, in which case there is a 1/m annual risk; otherwise, the annual risk from simulation shutdown is 0. In such a model, the expected value of the cosmic endowment would only be decreased by at most a factor of n, because the no-risk branch retains its full value and dominates the expectation. Whereas if you thought that there was definitely a 1/(mn) annual risk (i.e. the annual risk is IID across years), the survival probability would decay exponentially with time, and that risk would diminish the value of the cosmic endowment by many OoM.
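(A minimal sketch of the contrast, using survival probability as a rough proxy for the value retained; the values of n, m, and the horizon are purely illustrative, since none is pinned down above.)

```python
n, m = 10, 10**5   # illustrative: 1/10 chance the risk argument is right,
                   # 1/100,000 annual shutdown risk if it is
T = 10**8          # horizon in years, also illustrative

# Mixture model: with probability 1/n the 1/m annual risk is real,
# with probability 1 - 1/n there is no shutdown risk at all.
p_mixture = (1 / n) * (1 - 1 / m) ** T + (1 - 1 / n)

# IID model: a definite 1/(m*n) annual risk, independent across years.
p_iid = (1 - 1 / (m * n)) ** T

print(p_mixture)  # ~0.9: the no-risk branch keeps most of the value,
                  # so the endowment is cut by only a bounded factor
print(p_iid)      # ~3.7e-44 (~ e**-100): exponential decay, many OoM lost
```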
Simulation shutdown would end value in our reality, but in that of the simulators it would presumably continue, such that future expected value would be greater than that suggested by an extinction risk of 10^-6 per century? On the other hand, even if this is true, it would not matter, because it would be quite hard to influence the eventual simulators?
What would be the per-century risk in such a state?
Also, does the >50% probability of attaining such a state account for the possibility of alien civilizations destroying us or otherwise limiting our expansion?
I’d use reasoning like this, so simulation concerns don’t have to be ~certain to drastically reduce the EV gap between local and future-oriented actions.
What is the source of the estimate of the frequency of encounters with alien civilizations?
Here’s a good piece.
You could have acausal influence over others outside the simulation you find yourself in, perhaps especially others very similar to you in other simulations. See also https://longtermrisk.org/how-the-simulation-argument-dampens-future-fanaticism
Thanks for sharing that article, Michael!
On the topic of acausal influence, I liked this post and this clarifying comment.