Since you emphasize diversity, I wanted to ask whether (or to what extent) this paragraph is meant to include or exclude s-risks:
Existential risk, as a concept, is defined in a multitude of ways in our community. Some in this group take it to be ‘risks of human extinction, societal collapse and other events associated with these’[1], others’ conceptions involve ‘risk of permanent loss of humanity’s potential’[2] or the risk of the loss of large amounts of expected value of the future[3]. Some see it as an inseparable part of the broader class of risks to the existence of individuals, communities or specific ‘worlds’[4].
Some of those definitions definitely do, and some probably do, so yes!
However, I think there is a valid question as to whether studying s-risks gains much from being part of the ERS community, or whether it would be more beneficial as its own thing. I’m genuinely unsure how much either side gains from the involvement (maybe a lot!).
In general, the two areas commonly associated with x-risk where I’m not sure full inclusion in ERS is that useful (although being in conversation with ERS definitely helps) are pure technical alignment (when divorced from AI strategy) and maybe s-risks. But I’m pretty unsure of this take, and many signatories would probably disagree.
Good to know, thanks!