I don’t mean either (1) or (2), but I’m not sure it’s a single argument.
First, I think it’s epistemically and socially healthy for people to separate giving to their community from altruism. To explain a bit more: it’s good to view your community as a valid place to invest effort independent of its eventual value. Without that, I think people often end up being exploitative, pushing people to do things instead of treating them respectfully, or being dismissive of others (for example, telling people they shouldn’t be in EA because they aren’t making the right choices). If your community isn’t valued only for the eventual altruistic value its members will create, those failure modes are less likely.
Second, it’s easy to lose sight of eventual goals when focused on instrumental ones, and to get stuck in a mode where you are Goodharting community size or dollars donated. Both community size and total dollars seem like unfortunately easy attractors for this failure.
Third, relatedly, I think that people should be careful not to build models of impact that are too indirect, because they often fail in unexpected places. The simpler your path to impact is, the fewer failure points exist. Community building is many steps removed from the objective, and we should certainly be cautious about doing naïve EV calculations about increasing community size!
Separate from, but related to, community: I think your point about identity, and whether fostering EA as an identity is epistemically healthy, is also relevant to (1).
Your analogy to church spoke very powerfully to me, and to something I have always been a bit uncomfortable with. To me, EA is a philosophy/school of thought, and I struggle to understand how a person can “be” a philosophy, or how a philosophy can “recruit members”.
I also suspect that a strong self-perception that one is a “good person” can just as often provide (internal and external) cover for wrongdoing as it can motivate actually doing good, as any number of high-profile non-profit scandals (and, I’d guess, the anecdotal experience of most young women who have ever been involved in a movement for change) can attest.
I have nothing at all against organic communities, professional conferences, etc., but I also wonder whether there is evidence that building EA as an identity (“join us!”), as opposed to something that people can do, is instrumentally effective for first-order causes. Maybe it is, but I think it warrants some interrogation.
> healthy for people to separate giving to their community from altruism.
Is this realistically achievable, with the community we have now? How?
(I imagine it would take a comms team with a social-psychology genius and a huge budget, would still only work partially, and would require very strong buy-in from current power players and a revision of how EA is presented and introduced. But perhaps you think another, leaner and more viable approach is possible?)
>The simpler your path to impact is, the fewer failure points exist
That’s not always true.
Some extreme counter-examples:
a. Programmes on infant stunting keep failing, partly because an overly simple approach has been adopted (intensive infant feeding, Plumpy’Nut, etc., with insufficient attention to maternal nutrition, aflatoxin removal, treating parasites in pregnancy, adolescent nutrition, conditional cash transfers, etc.)
b. A critical-path plan was used for Apollo and worked much better than the simpler Soviet approach, despite being much more complicated.
c. The Brexit Leave campaign SEEMED simple, but was actually honed through practice on previous campaigns and was very sophisticated “under the hood”, which made it hard to oppose.