When I model the existence of simulations like us, SIA does not imply doom (as seen in the marginalised posteriors for fGC in the appendix here).
It does imply doom for us, since we’re almost certainly in a short-lived simulation.
And if we condition on being outside of a simulation, SIA also implies doom for us: we're more likely to find ourselves outside of a simulation if there are more basement-level civilizations, and more basement-level civilizations can coexist at our stage when more of them are doomed (a doomed civilization never becomes grabby, so it doesn't preclude later civilizations from arising).
It just means there weren't necessarily a lot of doomed civilizations in the basement-level universe many basement-level years ago, when our simulators were a young civilization.
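To make that bookkeeping concrete, here's a toy SIA update, a minimal sketch with made-up numbers (the world names, priors, and observer counts below are purely illustrative assumptions, not taken from the post):

```python
# Toy SIA update over two hypothetical worlds (all numbers made up).
# In each world, some basement-level civilizations exist at our stage,
# and the few that survive to maturity each run many ancestor simulations.

worlds = {
    # High doom rate: many civilizations reach our stage (doomed ones never
    # become grabby, so they don't preclude later ones), few survive to simulate.
    "high_doom": {"prior": 0.5, "basement": 1000, "survivors": 10},
    # Low doom rate: fewer civilizations coexist at our stage, more survive.
    "low_doom":  {"prior": 0.5, "basement": 100,  "survivors": 50},
}
SIMS_PER_SURVIVOR = 10_000  # simulated observers like us per mature civilization


def observers(w):
    """Total observers in our epistemic situation: basement + simulated."""
    return w["basement"] + w["survivors"] * SIMS_PER_SURVIVOR


# SIA: weight each world by prior * (number of observers like us).
sia = {name: w["prior"] * observers(w) for name, w in worlds.items()}
total = sum(sia.values())
sia = {name: v / total for name, v in sia.items()}

# Conditioning on being non-simulated: only basement observers count.
base = {name: w["prior"] * w["basement"] for name, w in worlds.items()}
base_total = sum(base.values())
base = {name: v / base_total for name, v in base.items()}

print(sia)   # unconditionally, simulated observers dominate: low_doom ~0.83
print(base)  # conditional on non-simulation, high_doom wins: ~0.91
```

The toy numbers mirror the claims above: adding simulated observers swamps the unconditional update (so SIA need not favour high doom overall), while conditioning on being outside a simulation restores the usual SIA pressure towards doom.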
I agree with what you say, though I would note:
(1) maybe doom should be disambiguated between “the short-lived simulation that I am in is turned off”-doom (which I can’t really observe) and “the basement reality Earth I am in is turned into paperclips by an unaligned AGI”-type doom.
(2) conditioning on me being in at least one short-lived simulation: if the multiverse is sufficiently large and the simulation containing me is sufficiently ‘lawful’, then I may also expect there to be basement reality copies of me too. In that case, doom is implied for (what I would guess is) most exact copies of me.
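To put a rough number on (2), under an illustrative framing of my own (the symbols below are hypothetical, not from the thread): if exact copies of me exist in $n_{\text{sim}}$ short-lived simulations and $n_{\text{base}}$ basement realities, the fraction of copies facing shutdown-type doom is

$$\frac{n_{\text{sim}}}{n_{\text{sim}} + n_{\text{base}}},$$

which is close to 1 whenever simulated copies vastly outnumber basement copies; the basement copies would only face the paperclip-type doom from (1).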
Yup, I agree the disambiguation is good. In the context of aliens, it’s even useful to disambiguate those types of doom from “Intelligence never leaves the basement reality Earth I am on”-doom, since paperclippers would probably become grabby.