I’m a senior software developer in Canada (earning ~US$70K in a good year) who, being late to the EA party, earns to give. Historically I’ve had a chronic lack of interest in making money; instead I’ve developed an unhealthy interest in foundational software that free markets don’t build, because its benefits would consist almost entirely of positive externalities.
I dream of making the world better by improving programming languages and developer tools, but AFAIK no funding is available for this kind of work outside academia. My open-source projects can be seen at loyc.net, core.loyc.net, ungglish.loyc.net and ecsharp.net (among others).
After following the Ukraine war closely for almost three years, I naturally also watch China’s potential for military expansionism. Whereas past leaders of China spoke of “forceful if necessary” reunification with Taiwan, Xi Jinping seems like a much more aggressive leader, one who would actually do it, especially since the U.S. is frankly showing so much weakness in Ukraine. I know this isn’t how EAs are used to thinking, but you have to start from the way dictators think. Xi, much like Putin, seems to idolize the excesses of his country’s communist past, and is a conservative gambler: he will take a gamble if the odds seem sufficiently in his favor. Putin badly miscalculated his odds in Ukraine, but Russia’s GDP and population were about US$1.8 trillion and 145 million, versus US$17.8 trillion and 1.4 billion for China. At the same time, Taiwan is much less populous than Ukraine, and its would-be defenders in the U.S., EU, and Japan are weaker naval powers than China in that region (and would have to operate at much longer range). Last but not least, China is the factory of the world: if it decided to pursue military-style world domination, it could probably do so fairly well while simultaneously selling us vital goods at suddenly-inflated prices.
So when I hear that China has ramped up nuclear weapon production, I immediately read it as a nod toward Taiwan. If we don’t want an invasion of Taiwan, what do we do? Liberals have a habit of magical thinking in military matters: talking of diplomacy, complaining about U.S. “warmongers”, and running protests with “No Nukes” signs. But the decision to invade Taiwan doesn’t depend on anything the U.S. says; Xi simply *wants* Taiwan and has the power to take it. If he makes that decision, no words will stop him. So the Free World has no role to play here other than (1) to deter, and (2) optionally, to help Taiwan if Xi invades anyway.
Not all deterrents are military, of course; China and the USA would surely do huge economic damage to each other if China invaded, and that is a deterrent. But I think China has the upper hand here in ways the USA can’t match. On paper the USA spends more on its military, but for practical purposes it is the underdog in a war for Taiwan[1]. Moreover, President Xi surely noticed that all it took was a few comments from Putin about nuclear weapons to close off the possibility of a no-fly zone in Ukraine, NATO troops on the ground, and (for years) the use of American weapons against Russian territory. So I think Xi can reasonably, and correctly, conclude that China wants Taiwan more than the USA wants to defend it. (To me at least, comments about how we can’t spend more than 4% of the defense budget on Ukraine “because we need to be ready to fight China” just show how unserious the USA is about defending democracy.) Still, U.S. aid to Taiwan is certainly a risk for Xi, and I think we need to make that risk look as big and scary as possible.
All this is to say that warfighting isn’t the point; who knows if Trump would even bother. The point is to create a credible deterrent as part of efforts to stop the Free World from shrinking further. If war comes, maybe we fight, maybe we don’t. But war is more likely whenever dictators think they are stronger than their victims.
I would like more EAs to think seriously about containment, democracy promotion, and even epistemic defenses. For what good is it to make people healthier and more prosperous if those people later end up in a dictatorship that conscripts them or their children to fight wars, perhaps even wars against democracies? (I’m thinking especially of India and the BJP here. And yes, it’s still good to help them despite the risk; I’m just saying it’s not enough and we should have even broader horizons.)
Granted, maybe we can’t do anything. Maybe there’s nothing tractable and cost-effective in this space. There are probably neglected things, though: when the Ukraine war first started, I thought Bryan Caplan’s “make desertion fast” idea was good, and I wish somebody had looked into counterpropaganda operations that could have made the concept work. Still, I would like EAs to understand some things:
1. The risks of geopolitics have returned: basically, cold-war stuff.
2. EAs overly focus on x-risk and s-risk over catastrophic risk (c-risk). Technically a catastrophe is far less bad than extinction, but it doesn’t *feel* less bad. c-risk is more emotionally resonant for people, and risk-management work probably overlaps heavily between the two, so it’s probably easier to connect with policymakers over c-risk than x-risk.
3. I haven’t heard EAs talk about “loss-of-influence” risk. One form of this would be AGI takeover: if AGIs are much faster, smarter, and cheaper than us (whether they are controlled by humans or by themselves), a likely outcome is one in which ordinary humans have no control over what happens: either AGIs themselves or dictators with AGI armies make all the decisions. In this sense, it sure seems we are at the hinge of history, as future humans may have no control, and past humans had an inadequate understanding of their world. But in this post I’m pointing to a more subtle loss of control, in which the balance of power shifts toward dictatorships until they decide to invade democracies further and further from their sphere of influence. If global power shifts toward highly censored strongman regimes, EAs’ influence could eventually wane to zero.
Just my opinion, but this video raises some of the key points.