I agree that our civilization is unstable, and climate change, nuclear war, and resource exhaustion are certainly important risks to be considered and mitigated.
With that said, societal collapse, while certainly bad, is not extinction. Resource exhaustion and nuclear war are unlikely to drive us to extinction, and even climate change would have a hard time killing us all on its own (though it may not arrive on its own; compounding catastrophes are certainly possible).
Humans have recovered from societal collapses several times in the past, so you would need to argue why recovery couldn’t happen again if Western techno-capitalist society suffered the same fate.
For example, even if P(collapse | AGI is never built) were 1, pursuing that path would still be preferable to creating AGI with P(extinction) > 0.05 (the figure cited in the recent AI expert survey). I’m willing to accept a very high level of s-risk to avoid an x-risk with a sufficiently high probability of occurrence, because extinction would be a uniquely tragic event.