I guess my prior coming into this is that non-existential catastrophes are still pretty existentially important, because:
they are bad in and of themselves
they are destabilising and make it more likely that we end up with existential catastrophes
I definitely wasn’t thinking explicitly about post-ASI catastrophes meaning we’d have to rerun the time of perils
But I was thinking about stuff like 'a big war would probably set back AI development and could also make culture and selection pressures a fair bit worse, such that I'd feel worse about the outcome of AI development after that'. And similarly for bio.
It sounds like your prior was that non-existential catastrophes are much, much less important than existential ones, and that these considerations are a big update for you.
So I think part of why I'm less interested in this than you are is just that we're coming in with different priors, where this update is fairly small for me and doesn't change my prioritisation that much?
Fixed, sorry!