Two criticisms:
On two occasions you referred to nuclear war as an “existential risk”. It’s not. You also referred to 1970s-tier bioweapons as an “existential risk”; they weren’t. Both are global catastrophic risks (GCRs) but not existential ones: there have never been enough nukes to kill all humans, and even an infectious disease will see its effective reproduction number R drop below 1 before population density drops to 0, so it burns out short of extinction. We are at a point now where biotechnology is beginning to pose notable X-risk, but we weren’t then.
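To spell out the disease point, here’s a toy density-dependent transmission model (a sketch only; the linear scaling with density, and the symbols ρ and ρ₀, are my assumptions for illustration):

$$R_{\text{eff}} = R_0 \cdot \frac{\rho}{\rho_0}, \qquad R_{\text{eff}} < 1 \iff \rho < \frac{\rho_0}{R_0}$$

That is, if transmission scales with population density ρ (with ρ₀ the pre-plague density), spread self-extinguishes once density falls below ρ₀/R₀, which is strictly positive for any finite R₀; the pathogen dies out while survivors remain.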
You mentioned that the communities you reference, and EA/Rats, are overwhelmingly male, but you never actually argue why this is relevant. Do remember that a non-trivial fraction of Rats are not feminists, and this pings their “hostile politics” detectors (as does editing the quote from “men” to “people”); that’s a loss of persuasiveness, which should be avoided unless you need it to make some sort of point.
I don’t think a world dystopia is strictly necessary, but a successful long stop for AI (the ~30+ years it’ll probably take) will probably require knocking over a couple of countries that refuse to play ball. It seems fairly hard to keep even small countries from setting up datacentres and chip factories except by threatening or using military force.
To be clear, I think that’s worth it. Heck, nuclear war would be worth it if necessary, although I’m not sure it will be: the PRC in particular I rate as >50% to either a) agree to a stop, or b) get destroyed in a non-AI-related nuclear war in the next few years (or both).