However, “AI accidents” don’t communicate the scale of a possible disaster. Something like “global catastrophic AI accidents” may be even clearer. Or “permanent loss of control of a hostile AI system”.
“permanent loss of control of a hostile AI system”—This phrasing seems especially likely to invite the science-fiction interpretation, to me.
I agree with the rest.