Nice. Well, I guess we just have different intuitions then—for me, the chance of extinction or worse in the Octopcracy case seems a lot bigger than “small but non-negligible” (though I also wouldn’t put it as high as 99%).
Human groups struggle against each other for influence/power/control constantly; why wouldn’t these octopi (or AIs) also seek influence? You don’t need to be an expected utility maximizer to instrumentally converge; humans instrumentally converge all the time.