“EA is an aggressive set of memes. Best handled by people firmly grounded in some other community or field or worldview.”
What do you mean?
(Context: I accept the following philosophy almost hook, line, and sinker.)
I mean the amount of mental room the philosophy takes up: the uncapped demands, the absence of an act/omission distinction, the absence of a notion of “enough” or supererogation, the astronomical stakes, and the sweeping devaluation of ineffective things.
Consider the strong Singer principle:
“If it is in our power to prevent something bad from happening, without thereby sacrificing anything of comparable moral importance, we ought, morally, to do it.”
and even the weak Singer principle:
“If it is in our power to prevent something very bad from happening, without sacrificing anything morally significant, we ought, morally, to do it.”
This is in some sense much broader than most sets of deontological rules: it applies to more of your actions, and for practical purposes neither constraint ever stops being active. “Not helping is the same [value] as harming.” The work is never done. It can take over your whole life if you let it, and I know people who did.
Throw in the vast, humbling scale of long-termism and you have something which can totally occupy you indefinitely.
What’s wrong with that, if the philosophy is true/good? Well, as noted, some people are vulnerable and suffer a lot from this attitude to their life, to the point where they actually don’t do the most good despite trying appallingly hard. But even among people without serious scrupulosity I think a naive approach to these ideas can be harmful: for instance, in making the community more homogeneous and likely to miss important information or values, or in causing subtle burnout on the scale of years.
Now, the fair response is that the community version is quite different from the pure philosophy, and that anyway the community’s philosophers are uniquely focussed on moral uncertainty, and that EA leaders repeatedly emphasise mental health, sustainability, and groundedness.
But I notice that for the sorts of philosophically serious young people the movement moves through, these ideas get taken up without these caveats and hedges. My anecdotal observation is that people do far better with some other anchor (social or emotional or intellectual). Personally better, and literally morally better.
It’s been an ongoing discussion at SPARC and ESPR to try to decide how much or how little exposure to EA (as opposed to “EA ideas”) we want to make explicit during the camps. So far the answer has been mostly “fairly little,” and of course we do focus quite a lot on frank conversations about the upsides and downsides. But it’s definitely difficult to pass down wisdom rather than just knowledge, and some of the questions have no genuine or easy answer. Thinking on this is certainly something that keeps me up at night every so often.
A saving grace is that the theory automatically adjusts its demands to what is sustainable; if X is actually too demanding, then X stops being a demand. But this is subtle stuff (how do I know it’s actually too demanding without first burning out?) and I think many people mishandle it. And so I balk at pushing a theory or practice so easy to mishandle.
You explained what you meant, anticipated my objection, and provided a follow-up. I largely agree. Thank you!
“Best handled by people firmly grounded in some other community or field or worldview.”
I’d be interested in you fleshing out a bit more what being grounded in this way looks like to you. (E.g. what are some examples of communities/fields/worldviews EAs you know have successfully done this with?)
Not totally sure what I mean. Something like regularisation: helping to dampen fanaticism, unilateralism, information cascades, and goodharting.
Complexity science, secular Buddhism, Christianity, neoliberalism, libertarianism, phenomenology, humanism (hallo Swiss EAs), transhumanism, environmentalism, various conflict theories (Marxism, IR realism, feminism, game theory, public choice theory), the hermeneutics of suspicion, global justice, basic goods, capability theory, entrepreneurship, patriotism, evo psych, circling, Toastmasters, punk, weird Twitter. (I’m including a couple adjacent people who know EA and work with EAs but don’t call themselves EAs.)
This is besides a thousand fandoms and hobbies (which help with social grounding but less on intellectual grounding).
Thanks. This was interesting and I think I buy that this can be really important. This comment actually gave me an interesting new frame on some of the benefits I’ve gotten from having a history with punk music.
In some ways I think you get this for free by being old enough to have graduated from uni before EA existed. I hadn’t exactly appreciated this as a way my experience differs from younger EAs’.
Implies that we should have non-focus universities as well as focus universities. Nature reserves. Strategic groupthink antidote reserve.
(99% joking)
Like Howie, the past year of intense community building and working with young people has made me reflect on my age + relationship with my family + Jewish community, and how those have been potentially more grounding than I gave them credit for. My college friends are a perfect blend of “buy a bunch of EA ideas” and “aren’t in the community” and “skeptical of some stuff” and “really fucking smart,” which lets them call me out on nonsense.
Rationality was a deep love of mine, at 17! But not the only love.