Yep, I +1 this response. I don’t think Eliezer is proposing anything unusual, given the premise that AGI is more dangerous than nukes (a very common belief in EA, though not universally shared). I think the unusual aspect is mostly just that Eliezer is being frank and honest about what treating AGI development and proliferation like nuclear proliferation actually looks like in the real world.
He explained his reasons for doing that here:
https://twitter.com/ESYudkowsky/status/1641452620081668098
https://www.lesswrong.com/posts/Lz64L3yJEtYGkzMzu/rationality-and-the-english-language