Inward vs. Outward Focused Altruism

The idea for this post originally arose when I was thinking about how to steelman localism. EA typically directs most of its effort outside of itself. This is unsurprising since our goal is to do what is best for the world, not ourselves.

But what happens if we adopt a long-termist perspective, with at least medium-term timelines and a reasonable capacity for a community to recursively self-improve? Then, depending on the exact parameters, spending most of our effort investing in our own community might actually be the best strategy.

Of course, there’d be strong social pressures against this in a community that presents itself as altruistic. From the outside, it wouldn’t look very altruistic at all. It could very easily be portrayed as self-delusion or a giant Ponzi scheme, and there’s almost nothing humans hate as much as hypocrisy. Nonetheless, I believe this strategy at least warrants consideration.

Admittedly, there’s a lot that the above argument leaves out. Improving the world isn’t just about accumulated resources and capabilities, but also about practical experience, which is built up over time. Recursive self-improvement can easily recurse forever and never actually produce any impact. A long waiting period increases worries about value drift. And there is always significant ‘leakage’, with people leaving the community, not utilising their capabilities, or pursuing their own goals instead.

But beyond this, the biggest consideration for the Effective Altruism community is as follows: EA has found a niche. Even if an inward focus (that is, recursive self-improvement) were conclusively shown to be more effective than the traditional outward focus, it would likely be a mistake for EA to pursue it. Doing so would throw away a vast amount of cultural capital, structure, and accumulated knowledge, which would be rather hard to justify when such a project could just as well be pursued under the banner of a new group.

Would such recursive self-improvement actually be viable? This is something I’m deeply uncertain about. In particular, my biggest worry would be leakage. Whenever a project levels people up, the incentive is to join, receive the training and resources, and then leave to pursue one’s own goals. This can occur either as an intentional exploit or as a sudden loss of motivation once the benefits have been received. It is a very thorny problem, especially since many potential solutions can easily make a community excessively cult-like.
