I guess my prior coming into this is that non-existential catastrophes are still pretty existentially important, because:
they are bad in and of themselves
they are destabilising and make it more likely that we end up with existential catastrophes
I definitely wasn’t thinking explicitly about post-ASI catastrophes meaning we’d have to rerun the time of perils
But I was thinking about stuff like ‘a big war would probably set back AI development and could also make culture and selection pressures a fair bit worse, such that I feel worse about the outcome of AI development after that’. And similarly for bio
It sounds like your prior was that non-existential catastrophes are much, much less important than existential ones, and then these considerations are a big update for you.
So I think part of why I’m less interested in this than you are is just having different priors where this update is fairly small/doesn’t change my prioritisation that much?
Yep, definitely for me ‘big civ setbacks are really bad’ was already baked in from the POV of setting bad context for pre-AGI-transition(s) (as well as their direct badness). But while I’d already agreed with Will about post-AGI not being an ‘end of history’ (in the sense that much remains uncertain re safety), I hadn’t thought through the implication that setbacks could force a rerun of the most perilous transition(s), which does add some extra concern.
I do think that non-existential level catastrophes are a big deal even apart from the rerun risk consideration, because I expect the civilisation that comes back from such a catastrophe to be on a worse values trajectory than the one we have today. In particular, because the world today is unusually democratic and liberal, I expect a re-roll of history to result in less democracy than we have today at the current technological level. However, other people have pushed me on that, and I don’t feel like the case here is very strong. There are also obvious reasons why one might be biased towards having that view.
In contrast, the problem of having to rerun the time of perils is very crisp. It doesn’t seem to me like a disputable upshot at the moment, which puts it in a different category of consideration at least — one that everybody should be on board with.
I’m also genuinely unsure whether non-existential level catastrophe increases or decreases the chance of future existential level catastrophes. One argument that people have made that I don’t put that much stock in is that future generations after the catastrophe would remember it and therefore be more likely to take action to reduce future catastrophes. I don’t find that compelling because I don’t think that the Spanish flu made us more prepared against Covid-19, for example. Let alone that the plagues of Justinian prepared us against Covid-19. However, I’m not seeing other strong arguments in this vein, either.
“I don’t think that the Spanish flu made us more prepared against Covid-19” — actually, I’m betting our response to Covid-19 was better than it would have been without having had major pandemics in the past. For example, the response involved developing effective vaccines very quickly.