In response to point 2 - if you see human civilization continuing to develop indefinitely without regard for other species, wouldn’t all other species go extinct, except maybe a select few?
Other species are instrumentally very useful to humans, providing ecosystem functions, food, and sources of material (including genetic material).
On the AI side, it seems possible that a powerful misaligned AGI would find ecosystems and/or biological materials valuable, or that it would be cheaper to use humans for some tasks than machines. I think these factors would raise the odds that some humans (or human-adjacent engineered beings) survive in worlds dominated by such an AGI.
“Without regard for other species” is doing a lot of work in that question; in practice, humans have strong instrumental reasons to keep many species around.
The idea that a misaligned AGI would preserve humans for these instrumental reasons seems pretty unlikely to me.