Impact: AI causes the extinction of people in the next 1000 years.
Why is this a priority? Extinction events are very bad from the point of view of people who want the future to be big and utopian. The 1000-year time frame (I think) is long enough to accommodate most timelines for very advanced AI, but short enough that we don’t have to worry about “a butterfly flaps its wings and 10 million years later everyone is dead” type scenarios. While the scenario is speculative, given what we know right now it does not seem reasonable to assign it vanishingly low probability. Finally, my impression is that while this risk is taken seriously in and near the EA community, outside the community it is largely not taken seriously to a degree commensurate with reasonable estimates of its subjective likelihood and severity.