Effective Local Altruism

Nate Silver and Zvi Mowshowitz object to EA’s impartiality: treating everyone equally, no matter how far away they are, means spending resources in a way that puts you and “your people” at a disadvantage, which may cause problems down the line.

From https://thezvi.wordpress.com/2024/09/27/book-review-on-the-edge-the-future/#more-23968 :

...But that is not the only reason you need spatial and knowledge-based partiality.

Civilization would not run, people would not survive or reproduce or even produce, the social contract would collapse, if you did not favor and exchange with and cooperate uniquely with those around you beyond what you do with strangers halfway around the world. All that, and real competition, is necessary. Those strangers are not only people too but also certified Popes, so please treat them right, but that does not mean full equal standing. The alternative is not game theory compatible, it is not fit, it does not long survive.

There is little virtue in being too virtuous to sustain that virtue, and indeed if that is a thing you are thinking of as virtuous then you have chosen your virtues poorly.

[Silver:] And even if I think there’s something honorable about acting morally in a mostly selfish world, I also wonder about the long-term evolutionary fitness of some group of people who wouldn’t defend their own self-interest, or that of their family, their nation, their species, or even their planet, without at least a little more vigor than they would that of a stranger. I want the world to be less partial than it is, but I want it to be at least partially partial. (6653)

Yep.

My main response to this is that having more prosperous, innovative, technologically advanced, and free countries around the world is good for “my group” and for me personally. That said, I would be sympathetic to arguments that we should focus our global development money on countries more likely to become US-allied democracies rather than on ones that will become CCP-allied dictatorships.

A secondary response is this: instead of Zvi and Nate saying “we aren’t EAs because we think it’s important to care for local issues so our altruism can be sustainable and strengthen ourselves and our communities,” there should be a branch of EA that says “we are EAs who look for the most effective interventions locally, making our altruism sustainable and strengthening ourselves and our communities.”

The logical conclusion I see from the thinking above is to create GiveWell_USA, GiveWell_Canada, GiveWell_NewYork, etc. What are the most tractable+important+neglected charitable funding opportunities for improving the health+wealth of the USA/Canada/New York/etc.?
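To make the tractable+important+neglected question concrete, here is a minimal sketch of how a GiveWell_USA-style ranking might start. The cause areas and all the 1–10 scores below are hypothetical placeholders I made up for illustration; a real effort would replace them with researched estimates.

```python
# Toy importance/tractability/neglectedness (ITN) ranking for hypothetical
# local cause areas. All names and scores are made-up placeholders.

causes = {
    # name: (importance, tractability, neglectedness), each scored 1-10
    "lead pipe replacement": (8, 6, 7),
    "housing permitting reform": (9, 3, 4),
    "rural broadband": (6, 7, 5),
}

def itn_score(importance, tractability, neglectedness):
    """Multiplicative ITN score: a weak showing on any axis drags the total down."""
    return importance * tractability * neglectedness

ranked = sorted(causes.items(), key=lambda kv: itn_score(*kv[1]), reverse=True)
for name, scores in ranked:
    print(f"{name}: {itn_score(*scores)}")
```

The multiplicative form (rather than a sum) encodes the usual EA intuition that a cause scoring near zero on any one axis is a poor funding target no matter how well it scores elsewhere.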

Even more locally, consider the university level. I’m at Northwestern University. If I were to pick a change to NU that would most strengthen us for the least money, one idea would be buying a ton of dry-erase markers to put in all the classrooms with whiteboards (I’ve been in multiple classes where the professor tries to write but finds all the markers dried out!). Buying and donating dry-erase markers for NU could be considered a Local EA intervention. I’ve also heard good things about UV disinfectant lights, but I haven’t looked into them much.

At a larger scale, YIMBYs and other activists are trying to strengthen their communities, though often innumerately, and they frequently get trapped in intractable politics.

Zvi’s work fighting the Jones Act can be considered a sort of Effective Local Altruism for the US, but that’s not the sort of project that can absorb large amounts of funding.

If I wanted to strengthen the US with my donation, what should I do with it? I don’t know! I wish I did! At a minimum, donating to underfunded schools seems like a solid choice: it avoids political battles, and it would help produce educated people who are healthier and more productive. A real answer would involve lots of spreadsheets.

And if you decide where to donate based on the expected-impact calculations in a spreadsheet, then according to me, you’re an EA.
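For what one column of such a spreadsheet might look like, here is a minimal expected-impact-per-dollar sketch. Every intervention name, cost, probability, and benefit figure below is a hypothetical placeholder, not real data; the point is only the shape of the calculation.

```python
# Illustrative expected-impact calculation in the spirit of a donation
# spreadsheet. All interventions, costs, and benefit estimates are
# hypothetical placeholders.

interventions = {
    # name: (cost_usd, probability_of_success, benefit_if_successful)
    # "benefit" is in arbitrary made-up units for illustration
    "dry-erase markers": (500, 0.9, 200),
    "UV disinfectant lights": (10_000, 0.5, 8_000),
    "school supplies fund": (50_000, 0.8, 30_000),
}

def expected_impact_per_dollar(cost, p_success, benefit):
    """Expected benefit units bought per dollar donated."""
    return (p_success * benefit) / cost

ranked = sorted(
    interventions.items(),
    key=lambda kv: expected_impact_per_dollar(*kv[1]),
    reverse=True,
)
for name, (cost, p, benefit) in ranked:
    print(f"{name}: {expected_impact_per_dollar(cost, p, benefit):.2f} units/$")
```

Normalizing everything to expected impact per dollar is what lets very different interventions (markers, lights, schools) sit in the same ranking at all, which is the core move the post is calling “EA.”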