I agree that an “incoherent superintelligence” does not sound very reassuring. Imagine someone saying this:
I’m not too worried about advanced AI. I think it will be a superintelligent hot mess. By this I mean an extremely powerful machine that has various conflicting goals. What could possibly go wrong?