Note that the comment you’re replying to says “take over the world” not extinction.
I think extinction is unlikely conditional on takeover (and takeover seems reasonably likely).
Neanderthal takeover doesn’t seem very bad from my perspective, so I’m probably basically fine with that. (Particularly if we ensure that some basic ideas are floating around in Neanderthal culture, like “maybe you should be really thoughtful and careful with what you do with the cosmic endowment”.)
I agree, but the original comment said “In particular, I’m interested in accounts of the “how” of AI extinction”.