I think this is a valid concern. Separately, it’s not clear that all s-risks are x-risks, depending on how “astronomical suffering” and “human potential” are understood.
What do you think about the concept of a hellish existential catastrophe? It highlights both that (some) s-risks fall under the category of existential risk and that they have an additional important property absent from typical x-risks. The concept isolates a risk whose reduction should arguably be prioritized by EAs across a range of moral perspectives.