I am using "apocalyptic" broadly, so that it applies both to existential risk and to global catastrophic risk. IMO, if a nuclear war kills 99% of the population, it is still apocalyptic.
I think that distinguishing between global catastrophic risk and existential risk is extremely difficult. While climate scientists don't generally predict human extinction, I think this is largely because of pressure not to appear alarmist, and because of state and corporate interests in downplaying and ignoring climate change. On the other hand, there don't seem to be any comparable forces working to downplay AI risk.