Some of those definitions definitely do, and some probably do, so yes!
However, I think there is a valid question as to whether studying S-risks gains much from being part of the ERS community, or whether it would be more beneficial as its own thing. I'm genuinely unsure how much either side gains from the involvement (maybe a lot!).
In general, the two areas commonly associated with x-risk where I'm not sure full inclusion in ERS is useful (although being in conversation with ERS definitely helps) are pure technical alignment (when divorced from AI strategy) and maybe S-risks. But I'm pretty unsure of this take, and many signatories would probably disagree.
Good to know, thanks!