This goes considerably beyond ‘international treaties with teeth are plausibly necessary here’… Eliezer is proposing attacks on any country building AI above a certain capability level, whether or not it signs up to the treaty.
Is this actually inconsistent? If a country doesn’t sign up for the Biological Weapons Convention, and then acts in flagrant disregard of it, would it not be expected to face retaliatory action from signatories, plausibly up to military force depending on specifics? My sense was that the people who pushed for the introduction and enforcement of the BWC would have imagined such a response as within bounds.
I also think “with teeth” kind of obscures by abstraction here (since it doesn’t necessarily sound like it means war/violence, but that’s what’s being proposed).
I don’t think what Eliezer is proposing would necessarily mean war/violence either – conditional on a world actually getting to the point where major countries are agreeing to such a treaty, I find it plausible that smaller countries would simply acquiesce in shutting down rogue datacenters. If they didn’t, diplomacy would be tried first, then probably economic sanctions, before any military force. Eliezer is saying that governments should be willing to escalate to military force if necessary, but I don’t think it’s obvious that in such a world military force would be necessary.
Yep, I +1 this response. I don’t think Eliezer is proposing anything unusual (given the belief, very common in EA though not universally shared, that AGI is more dangerous than nukes). I think the unusual aspect is mostly just that Eliezer is being frank and honest about what treating AGI development and proliferation like nuclear proliferation looks like in the real world.
He explained his reasons for doing that here:
https://twitter.com/ESYudkowsky/status/1641452620081668098
https://www.lesswrong.com/posts/Lz64L3yJEtYGkzMzu/rationality-and-the-english-language