Thanks for engaging!
On symmetry between options in the tails: if you can't rule out with certainty any upper bound on how long our descendants could last, then reducing extinction risk could have unbounded effects. The same may go for other x-risks. I do think heavy tails like this are very unlikely, but it's hard to justifiably rule them out with certainty.
Or, you could have a heavy tail on the number of non-solipsist simulations, or on the number of universes in our multiverse (if it's spatially very large, or via quantum branches, pocket universes, the universe restarting many times as in a Big Bounce, etc.), combined with acausal influence over what happens in them.