To further your “worse than Terminator” reframing in your Superintelligence section, I will quote Yudkowsky here (it’s said in jest, but the message is straightforward):
Dear journalists: Please stop using Terminator pictures in your articles about AI. The kind of AIs that smart people worry about are much scarier than that! Consider using an illustration of the Milky Way with a 20,000-light-year spherical hole eaten away.
Here, “AI risk is not like Terminator” attempts to dismiss the possibility of a fair fight… and rhetorically it could be reframed as “yes, think Terminator, except far more lopsided in favor of Skynet. Granted, the movies would have been shorter that way.”