I think what we should be talking about is whether we hit the “point of no return” this century for extinction of Earth-originating intelligent life. Where that could mean: Homo sapiens and most other mammals get killed off in an extinction event this century; then technologically-capable intelligence never evolves again on Earth; so all life dies off within a billion years or so. (In a draft post that you saw of mine, this is what I had in mind.)
The probability of this might be reasonably high. There I’m at, idk, 1%–5%.
Notably, the extinction event in this scenario is non-AI-related, I assume? And it needs to occur before we have created self-sufficient AIs.