Is there any specific reason for discounting the possibility that arthropods or reptiles could evolve, over millions of years, into something that equals or surpasses the intelligence of the last humans alive?
No, I don't think the analysis should discount this. Unless there is an unknown hard-to-pass point (a filter) between existing mammals/primates and a human-level civilization, it seems quite likely that intelligent life would re-evolve. (I'd say an 85% chance of a new civilization conditional on human extinction but not primate extinction, and 75% if primates also go extinct.)
There is also the potential for alien civilizations, though I think this has a lower probability (perhaps 50% that aliens capture >75% of the cosmic resources in our light cone if Earth-originating civilizations don't capture these resources).
IMO, the dominant effect of extinction due to bio-risk is that a different Earth-originating species acquires power, and my values on reflection are likely to be closer to humanity's values on reflection than to that other species'. (I also have some influence over how humanity spends its resources, though I expect this effect is not that big.)
If you were equally happy with other species, then I think you'd still only take roughly a 10x discount on the badness of extinction from these considerations, because there is some possibility of a hard-to-pass barrier between other life and humans. 10x discounts don't usually seem like cruxes IMO.
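To spell out the arithmetic behind that rough 10x figure, here is a minimal sketch (a toy expected-value calculation using the 85% re-evolution estimate above; the exact numbers and framing are illustrative assumptions, not a formal model):

```python
# Toy calculation: how much a chance of re-evolution discounts the
# badness of extinction, if you valued a successor civilization
# equally to humanity's. Numbers are illustrative assumptions.

p_new_civilization = 0.85            # estimate above: P(new civilization | human extinction)
p_hard_barrier = 1 - p_new_civilization  # chance no successor civilization ever arises

full_loss = 1.0                      # normalized badness of losing all future value
expected_loss = p_hard_barrier * full_loss  # only the "hard barrier" branch counts

discount_factor = full_loss / expected_loss
print(f"expected loss: {expected_loss:.2f} of full loss")   # -> 0.15
print(f"discount factor: ~{discount_factor:.1f}x")          # -> ~6.7x
```

The point is just that the discount is bounded by the chance of a hard-to-pass barrier, so it stays around one order of magnitude rather than many.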
I would also note that for AI x-risk, intelligent life re-evolving is unimportant. (I also think AI x-risk is unlikely to result in extinction, because AIs are unlikely to want to kill all humans for various reasons.)
And over timescales of billions of years, there's also the possibility of intelligence re-evolving from basic eukaryotes.
Earth will be habitable for only about 1 billion more years, which probably isn't quite enough time for this.