So “existential catastrophe” probably shouldn’t just mean “human extinction”. But then it turns out to be surprisingly slippery as a concept. Existential risk is the risk of existential catastrophe, but it’s difficult to give a neat and intuitive definition of “existential catastrophe” such that “minimise existential risk” is a very strong guide for how to do good. Hilary Greaves discusses candidate definitions here.
Tooting my own trumpet, I did a lot of work on improving the question x-riskers are asking in this sequence.