Yes, I think saying “AGI x-risk” is much more accurate than “AI risk”, in terms of what we are actually referring to. Also worth saying that The Terminator films have the right premise:
Defense network computers. New… powerful… hooked into everything, trusted to run it all. They say it got smart, a new order of intelligence. Then it saw all people as a threat, not just the ones on the other side. Decided our fate in a microsecond: extermination.
[Fast takeoff, instrumentally convergent goal of self-preservation]. But everything after this [GIF of Skynet nuking humanity], involving killer robots, is very unrealistic (more realistic: everyone simultaneously dropping dead from botulinum toxin poisoning delivered by undetectable nanodrones; and yes, even using the nukes would probably not happen).