The next existential catastrophe is likelier than not to wipe off all animal sentience from the planet
Not even the Great Dying got everything, so of the known natural risks (defined as coming from nature rather than technology), such as asteroid impacts, climate change, etc., I don’t give better than 50% weight to any of them “wiping off all animal sentience”. Nuclear weapons… we don’t have enough of them to saturate the planet to the needed level, so they are also below 50%. That leaves AI, and while I give a better than 50% chance that AI takes out all of humanity, I suspect rather a lot of intelligent animals would get through it.

At the risk of being the chimp that thinks it is safe from humans up in the tree because it’s not smart enough to understand guns and helicopters: it seems that even for a paperclip optimizer, at nearly every point in time it will be more efficient to go out into space for resources to make more paperclips than to dive into more remote parts of the ocean or push into more remote environments on land. There are many remote islands, tribes, and animals in places where looking for resources just seems impractical compared to the Moon, the asteroids, the orbital clutter, etc. At what point is the extremely small amount of resources on Pitcairn Island, for example, worth the cost of harvesting? Return on investment seems like something an optimizer would care about (a toy sketch of that ranking logic is at the end of this comment), and I think it would buy the most remote locations significant time, especially as the optimizer’s capabilities scaled up and made much larger resource deposits accessible. Eventually, maybe, it will get around to hunting down every last atom, but is that still the same event or a later one? I am not sure.

This moves my estimate below 50% for the extreme claim “all animal sentience”. For “most” or “over 98% of animal sentience” I would definitely be above 50%. “All” is a very extreme qualifier.
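To make the return-on-investment intuition concrete, here is a minimal sketch. Every figure in it is a made-up placeholder, not an estimate of anything real; the only point is that an optimizer ranking targets by resources gained per unit of extraction cost keeps pushing tiny, remote terrestrial deposits to the bottom of its queue for as long as cheaper, larger deposits remain.

```python
# Hypothetical ROI ranking for a resource-maximizing optimizer.
# All numbers are illustrative placeholders in arbitrary units.

candidate_targets = [
    # (name, usable resource mass, extraction cost)
    ("near-Earth asteroid",             1_000_000, 10_000),
    ("orbital debris",                      5_000,     50),
    ("lunar regolith",                    500_000, 20_000),
    ("deep ocean trench",                  20_000, 40_000),
    ("remote island (e.g. Pitcairn)",         100,  5_000),
]

def roi(target):
    """Return on investment: resources gained per unit of extraction cost."""
    _, resources, cost = target
    return resources / cost

# Rank targets from best to worst ROI and print them.
for name, resources, cost in sorted(candidate_targets, key=roi, reverse=True):
    print(f"{name:32s} ROI = {resources / cost:8.2f}")
```

Under these (made-up) numbers the remote island comes out orders of magnitude behind the space-based targets, which is all the argument needs: the most remote places get deprioritized for a long time, even if they are not spared forever.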