and a probability of 1 % of extinction being an existential catastrophe
I think you should probably have a higher probability on some unknown filter making it less likely that intelligent civilization re-evolves. (Given anthropics.)
I’d say there is a 20% chance that intelligent life doesn’t re-evolve on Earth due to this mechanism.
There are also potentially aliens, which is perhaps another factor of 2, getting me to a 10% chance of no group capable of using resources, conditional on the literal extinction of all intelligent civilization on Earth. (That is 10x higher than your estimate.)
I also think that I’d prefer human control to the next evolved life, and to aliens, by a moderate amount due to similarity-of-values arguments.
I’ve now updated toward a higher chance that life re-evolves and a lower chance of some unknown filter, because we can see that the time gap from primates to intelligent civilization is quite small.
That makes sense. It looks like humans diverged from chimpanzees just 5.5 M years (= (5 + 6)/2*10^6) ago. Assuming the time from chimpanzees to a species similar to humans follows an exponential distribution with a mean equal to that time, the probability of not recovering after human extinction within the 1 billion years during which Earth will remain habitable would be only 1.09*10^-79 (= e^(-10^9/(5.5*10^6))). The actual probability of not recovering is higher due to model uncertainty; the time to recover may follow a different distribution.
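The estimate above can be reproduced directly: under an exponential distribution with mean T, the chance that the transition has not happened after H years is exp(-H/T).

```python
import math

# Sketch of the estimate above: if the chimpanzee-to-human transition time is
# exponentially distributed with mean T = 5.5 M years, the probability that no
# similar species re-evolves within the H = 1 B years Earth stays habitable
# is P(no recovery) = exp(-H / T).
T = (5 + 6) / 2 * 10**6   # mean transition time: 5.5 M years
H = 10**9                 # remaining habitable window: 1 B years
p_no_recovery = math.exp(-H / T)
print(f"{p_no_recovery:.3g}")  # prints 1.09e-79
```

The result is so extreme precisely because H/T ≈ 182 mean transition times fit into the habitable window, which is why the conclusion is sensitive to the exponential assumption.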
In addition, recovery can be harder for other risks:
Catastrophes wiping out more of the species in humans’ evolutionary past (e.g. the impact of a large comet) would have a longer expected recovery time, and therefore imply a lower chance of recovery during the time Earth will remain habitable.
As I said above, I estimated a 0.0513 % chance of not fully recovering from a repetition of the last mass extinction 66 M years ago, the Cretaceous–Paleogene extinction event.
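Purely as an illustration (the 0.0513 % figure comes from a separate analysis, not from this model), one can back out the mean recovery time that the same exponential model would need to produce that probability of non-recovery:

```python
import math

# Illustrative only: under the exponential-recovery model, solve
# exp(-H / T) = p for the mean recovery time T, given the quoted 0.0513 %
# chance of not fully recovering within Earth's 1 B remaining habitable years.
H = 10**9                  # remaining habitable window in years
p_no_recovery = 0.000513   # 0.0513 %
implied_mean = H / -math.log(p_no_recovery)
print(f"{implied_mean / 1e6:.0f} M years")  # prints 132 M years
```

So matching that estimate requires a mean recovery time of roughly 132 M years, about twice the 66 M years since the Cretaceous–Paleogene extinction, consistent with the point that deeper setbacks lower the chance of recovery.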
A rogue AI would not allow another species to take control.
The estimates I provided in my comment were mainly illustrative. However, my 1 % chance of existential catastrophe conditional on human extinction came from my expectation that, in the vast majority of worlds where humans go extinct in the next few centuries, they will be on board with going extinct because their AI or posthuman descendants would live on.