[Question] How would you define “existential risk?”

The classic definition comes from Bostrom:

Existential risk – One where an adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential.

But this definition, while poetic and gesturing at something real, is more than a bit vague, and many people are unhappy with it, judging from the long chain of clarifying questions in my linked question. So I'm interested in alternative definitions that the EA community and/or leading longtermist or x-risk researchers may wish to adopt instead.

Alternative definitions should ideally be precise, clear, unambiguous, and hopefully not too long.