I often get the sense that EA community builders (university/city/regional/national group organizers) don’t spend enough time thinking about EA ideas*. It might be because of a selection bias on capabilities, the nature of community building, or just time constraints, though that’s not so important for my question.
I ask this question here so that EA community builders (including myself) can avoid making mistakes from not having thought about EA enough. The main mistakes I’m concerned about are: causing reputational damage, spreading a wrong version of EA, over-focusing or under-focusing on certain cause areas, giving bad career advice, etc., though there are probably many more kinds of mistakes.
*By “EA ideas” here I refer to a much broader class of ideas than usually presented in an Intro Program or Fellowship. Feel free to insert your own notion of what “EA ideas” mean here.
Scrupulosity (moral OCD: moral concern taken to such an extreme that you hurt yourself), and the risk of pushing EA on people like this.
Also that—almost regardless of the friendliness and moderation with which you present it—EA is an aggressive set of memes. Best handled by people firmly grounded in some other community or field or worldview.
“EA is an aggressive set of memes. Best handled by people firmly grounded in some other community or field or worldview.”
What do you mean?
(Context: I accept the following philosophy almost hook, line, and sinker.)
I mean the amount of mental room the philosophy takes up: the uncapped demands, the absence of an act/omission distinction, the absence of a notion of “enough” or supererogation, the astronomical stakes, and the sweeping devaluation of ineffective things.
Consider the strong Singer principle:
“If it is in our power to prevent something bad from happening, without thereby sacrificing anything of comparable moral importance, we ought, morally, to do it.”
and even the weak Singer principle:
“If it is in our power to prevent something very bad from happening, without sacrificing anything morally significant, we ought, morally, to do it.”
This is in some sense much broader than most sets of deontological rules: it applies to more of your actions, and for practical purposes neither constraint ever stops being active. “Not helping is the same [value] as harming.” The work is never done. It can take over your whole life if you let it, and I know people who did.
Throw in the vast, humbling scale of long-termism and you have something which can totally occupy you indefinitely.
What’s wrong with that, if the philosophy is true/good? Well, as noted, some people are vulnerable and suffer a lot from this attitude to their life, to the point where they actually don’t do the most good despite trying appallingly hard. But even among people without serious scrupulosity I think a naive approach to these ideas can be harmful: for instance, in making the community more homogeneous and likely to miss important information or values, or in causing subtle burnout on the scale of years.
Now, the fair response is that the community version is quite different from the pure philosophy, and that anyway the community’s philosophers are uniquely focussed on moral uncertainty, and that EA leaders repeatedly emphasise mental health, sustainability, and groundedness.
But I notice that for the sorts of philosophically serious young people who move through the movement, these ideas get taken up without these caveats and hedges. My anecdotal observation is that people do far better with some other anchor (social or emotional or intellectual). Personally better, and literally morally better.
It’s been an ongoing discussion at SPARC and ESPR to try to decide how much or how little exposure to EA (as opposed to “EA ideas”) we want to make explicit during the camps. So far the answer has been mostly “fairly little,” and of course we do focus quite a lot on frank conversations about the upsides and downsides. But it’s definitely difficult to pass down wisdom rather than just knowledge, and some of the questions have no genuine or easy answer. Thinking on this is certainly something that keeps me up at night every so often.
A saving grace is that the theory automatically adjusts its demands to what is sustainable; if X is actually too demanding, then X stops being a demand. But this is subtle stuff (how do I know it’s actually too demanding without first burning out?) and I think many people mishandle it. And so I balk at pushing a theory or practice so easy to mishandle.
You explained what you meant, anticipated my objection, and provided a follow-up. I largely agree. Thank you!
“Best handled by people firmly grounded in some other community or field or worldview.”
I’d be interested in you fleshing out a bit more what being grounded in this way looks like to you. (E.g. what are some examples of communities/fields/worldviews EAs you know have successfully done this with?)
Not totally sure what I mean. Something like regularisation, helping to dampen fanaticism, unilateralism, information cascades, goodharting.
Complexity science, secular Buddhism, Christianity, neoliberalism, libertarianism, phenomenology, humanism (hallo Swiss EAs), transhumanism, environmentalism, various conflict theories (Marxism, IR realism, feminism, game theory, public choice theory), the hermeneutics of suspicion, global justice, basic goods, capability theory, entrepreneurship, patriotism, evo psych, circling, Toastmasters, punk, weird Twitter. (I’m including a couple adjacent people who know EA and work with EAs but don’t call themselves EAs.)
This is besides a thousand fandoms and hobbies (which help with social grounding but less with intellectual grounding).
Thanks. This was interesting and I think I buy that this can be really important. This comment actually gave me an interesting new frame on some of the benefits I’ve gotten from having a history with punk music.
In some ways I think you get this for free by being old enough to have graduated from uni before EA existed. I hadn’t exactly appreciated this as a way my experience differs from younger EAs’.
Implies that we should have non-focus universities as well as focus universities. Nature reserves. Strategic groupthink antidote reserve.
(99% joking)
Like Howie, the past year of intense community building and working with young people has made me reflect on my age + relationship with my family + Jewish community, and how those have been potentially more grounding than I gave them credit for. My college friends are a perfect blend of “buy a bunch of EA ideas” and “aren’t in the community” and “skeptical of some stuff” and “really fucking smart” to call me out on nonsense.
Rationality was a deep love of mine, at 17! But not the only love.
I think that the poor outcomes you listed (causing reputational damage, spreading a wrong version of EA, over-focusing or under-focusing on certain cause areas, giving bad career advice, and so on) are on the mark, but might not entirely stem from EA community builders not taking the time to understand EA principles.
For example, I can imagine a scenario where an EA community builder is not adept at presenting cause areas, but understands the cause-area landscape very well. Perhaps as a result of their poor communication skills (and maybe also a lack of self-assurance), some individuals on the edge of adopting EA in this particular group begin to doubt the EA community builder and eventually leave.
Back to the question. I think that group leaders, including EA community builders, don’t often take the time to empathize with, or comprehend, what the group’s focus means to each of its members.
The question of how this organization, movement, or cause (in this case EA, and EA cause areas) fits into group member X’s life is useful in that it can be predictive of how committed they are, and of how long they’ll stick around.
In my personal experience coming to understand EA, and in my experience helping others at my undergraduate institution understand EA principles, I have noticed that there are a few close (highly involved) individuals and many other, less involved individuals on the periphery.
Often, much of the work the highly involved individuals spend trying to draw the less involved individuals in could have been avoided by communicating the content of the group’s activities more clearly. Regularly making sure that everyone is on the same page (literally just bringing it up in conversation) can help reduce the damage an EA community builder might cause.
Practically speaking, exercises that would likely improve communication include: asking each member of the group what EA means to them, having each member present their case or analysis for why their cause area is more pressing than others, and running anonymous surveys to check that the group shares a common understanding of EA principles and of making an impact.