I personally do think the probability of eventual disempowerment is high. However, you are implying that it is 100%. If it is 99%, or even 99.9999999%, and one thinks the value of the future is significantly higher with humanity (not necessarily biological humans) in control rather than AI, then the stakes of humanity remaining in control are still astronomical.