Existential risk from AI is not just existential risk to Earth. It is existential risk to our entire future light cone, including all the aliens in it! A misaligned superintelligent AI would not stop at paperclipping the Earth.
Although I suppose aliens could be filed under ‘miracle outs’ for our current predicament with AI x-risk.