A fourteenth theory you should consider adding to your list is the possibility of AGI leading to S-risks. This could be considered similar to #3, but astronomical suffering has the potential to be far worse than extinction. One possible way this could come about is through a “near miss” in AI alignment.
Fair point! I think I’ll instead just encourage people to read the comments. Ideally, more theories will be suggested over time, and I don’t want to keep updating the list (and the title).
Here’s a chart I found of existential risks that includes S-risks.