I do not believe this interpretation is correct. Here is the passage again, including the previous paragraph for added context:
Shut down all the large GPU clusters (the large computer farms where the most powerful AIs are refined). Shut down all the large training runs. Put a ceiling on how much computing power anyone is allowed to use in training an AI system, and move it downward over the coming years to compensate for more efficient training algorithms. No exceptions for anyone, including governments and militaries. Make immediate multinational agreements to prevent the prohibited activities from moving elsewhere. Track all GPUs sold. If intelligence says that a country outside the agreement is building a GPU cluster, be less scared of a shooting conflict between nations than of the moratorium being violated; be willing to destroy a rogue datacenter by airstrike.
Frame nothing as a conflict between national interests, have it clear that anyone talking of arms races is a fool. That we all live or die as one, in this, is not a policy but a fact of nature. Make it explicit in international diplomacy that preventing AI extinction scenarios is considered a priority above preventing a full nuclear exchange, and that allied nuclear countries are willing to run some risk of nuclear exchange if that’s what it takes to reduce the risk of large AI training runs.
Your post begins with,
He advocates for bombing datacentres and being prepared to start shooting conflicts to destroy GPU clusters, and then advocates for “running some risk of nuclear exchange if that’s what it takes to reduce the risk of large AI training runs”. I cannot see any interpretation other than “threaten to bomb nuclear armed countries that train AI’s”.
And ends with,
To be fair, upon reading it again it’s more likely he means “threaten to conventionally bomb datacentres”. But this is still nuclear brinksmanship: bombing russia or china is an act of war, carrying a high chance of nuclear exchange.
If in the writing of a comment you realize that you were wrong, you can just say that.