This isn’t central to the post, but I’m interested in this parenthetical:
(To clarify: the BWC is an arms control treaty that prohibits bioweapons; it is unlikely that we’ll see anything similar with AI, i.e. a complete ban on any “AI weapons”, whatever this means.)
At first glance, a ban on AI weapons research or AI research with military uses seems pretty plausible to me. For example, one could ban research on lethal autonomous weapons systems and research devoted to creating an AGI without banning, e.g., the use of machine learning for image classification or text generation.
Can you say more about why this seems implausible from your point of view?
Hey Kerry! Good question. I included this disclaimer because it seems very hard to define exactly what we mean by an “AI weapon”, which makes a complete ban, like the one the BWC imposes, implausible.
I think I still don’t quite get why this seems implausible. (For what it’s worth, I think your view is pretty mainstream, so I’m asking about it more to understand how people are thinking about AI and not as any kind of criticism of the post or the parenthetical.)
It seems clear to me that an AI weapon could exist. AI systems designed to autonomously identify and destroy targets seem like a particularly clear example. A ban which distinguishes that technology from nearby civilian technology doesn’t seem much more difficult than distinguishing biological weapons from civilian uses of biological technology.
Of course we’re mostly interested in AGI, not narrower AI technology. I agree that society doesn’t think of AGI development as a weapons technology, so banning “AGI weapons” seems strange to contemplate, but it’s not too difficult to imagine that changing! After all, many proponents of the technology are clear that they think it will be the most powerful technology ever invented, granting its creators unprecedented strength. Various components of the US military and intelligence services certainly seem to think AGI development has military implications, so the shift to seeing it as a dual-use weapons technology doesn’t seem too big a leap to imagine.