After this discussion (and especially based on Greg’s comment), I would revise my point as follows:
The AI might kill us because 1) it sees us as a threat (most likely), 2) it uses up our resources/environment for its own purposes (somewhat likely), or 3) it converts all matter into whatever it deems useful, essentially instantly (seems less likely to me, though still plausible).
I think common framings typically omit point 2, and overemphasize and overdramatize point 3 relative to point 1. We should fix that.
Is this an overly pedantic nitpick? If you’re making claims that strongly violate most people’s priors, it’s not sufficient to be broadly correct. People will look at what you say and spot-check your reasoning. If the spot-check fails, they won’t believe what you’re saying, and it doesn’t matter if the spot-check is about a practically irrelevant detail, as long as they perceive that detail to be sufficiently important to the overall picture.
I also have a bit of an emotional reaction along the lines of: Man, if you go around telling people how they personally are going to be killed by AGI, you’d better be sure that your story is correct.
I’m not confident, sorry for implying otherwise.