What exactly is the alternative action that has vastly greater value in expectation, and why does it have greater value?
Ensuring human control throughout the singularity rather than having AIs take control very obviously has relatively massive effects. Of course, we can debate the sign here; I'm just making a claim about the magnitude.
I'm not talking about the extinction of all smart beings on Earth (AIs and humans), which seems like a small fraction of existential risk.
(Separately, the badness of such extinction seems somewhat overrated, because intelligent life would pretty likely just re-evolve within the next 300 million years. Intelligent life doesn't seem that contingent. Also aliens.)
For what it’s worth, I think my reply to Pablo here responds to your comment fairly adequately too.