It is called Effective Altruism, not Altruistic Effectiveness
This post is a personal reflection on certain attitudes I have encountered in the EA community that I believe can be misleading. It is based primarily on intuition rather than on thorough research or surveys.
It is not news that the EA community has an unbalanced demographic, with men in the majority.
I have heard from several women about what they dislike about the EA community, and this post reflects what I have taken from those conversations. I think that if we can move more in the direction I’m describing, the EA community can become warmer and more welcoming to all genders and races (and also more effective at doing good).
I’d like to note that I don’t think what I’m about to describe is a widespread problem, but a phenomenon that may occur in some places. Most of my experiences with the EA community have been very positive. I meet mostly caring people with whom I can have interesting, sometimes controversial discussions. And I often meet people who are very willing to help.
Now to the subject:
Some women I have spoken to have described a “lack of empathy” in the group or, more specifically, said that EA people came across as “tech bros” who lacked humility and wouldn’t help a stranger because it wouldn’t be the most effective thing to do. In an introductory discussion group we ran in our university group, one of the participants perceived some of EA’s ideas as “cold-hearted” and was very critical of the abstract, sometimes detached way of trying to calculate how to do good most effectively.
I believe that these impressions and experiences point to risks associated with certain EA-related ideas.
The idea of optimisation
First, the idea of optimising or maximising one’s impact is fraught with risks, which have already been described here, here and here (and perhaps elsewhere, too).
Judging some actions or causes as more or less worthy of our attention than others can certainly seem cold-hearted. While this approach is valuable for triage and for prioritising in difficult situations, it also has a dark side when it justifies not caring about what we would normally care about. We should not discredit what might be judged a lesser good just because some metric suggests it, nor should prioritisation lead us to lose our humility (impacts are uncertain and we are not omniscient) or our sense of caring.
What kind of community are we if people don’t feel comfortable talking about their private lives because they don’t optimise everything or don’t spend their free time researching or trying to make a difference? Or if people worry that time spent volunteering for less effective non-profits might not be valued, or might even be dismissed? What is the point of an ineffective soup kitchen? After all, it is a waste of time in terms of improving QALYs.
I have no doubt that even the thought of encountering such insensitive comments makes you feel uncomfortable.
The following quote might appear to conflict with the goal of EA, but I think it doesn’t, and it makes an important point.
“There is no hierarchy of compassionate action. Based on our interests, skills and what truly moves us, we each find our own way, helping to alleviate suffering in whatever way we can.”—Joseph Goldstein (2007) in A Heart Full of Peace
What we are trying to do is called Effective Altruism, not Altruistic Effectiveness, and we should first and foremost be trying to be altruistic, that is, good and caring people.[1]
The idea of focusing on consequences
I also think that an exaggerated focus on consequences can be misleading in a social context, as well as detrimental to personal well-being. Even if one endorses consequentialism, focusing on consequences may not be the best strategy for bringing about good ones.
One reason is that, as Stoic philosophy tells us, we can’t control the outcomes of our actions.
Another is that if we cling to consequences, they can distract us from what it means to live an ethical life. When we focus on outcomes rather than valuing effort and intention, what it means to be a good person becomes subject to considerable moral luck.
I think this focus on consequences, which is widespread in EA, can also lead to unhealthy social dynamics. One might be tempted to value effectiveness over kindness, intelligence over caring. I have heard some people say something like “well, I am not that impactful”, with a painful touch of imposter syndrome, comparing themselves to other people who have done more impressive things within the EA movement.
Even when we try to see each other as a team working towards a common goal of making the world a better place, we often can’t help but form opinions of other people by judging them based on what we value. We are animals who play status games, even when we may not like the idea. So our shared values, what we care about, are important to our community. And I think that these ideas, effectiveness and a focus on consequences, if overly endorsed, can make a community less welcoming and less warm-hearted. This starts with whether we frame and explain EA in the abstract as “trying to do good as effectively as possible” or as “trying to help people and animals as well as we can”, and ends with the values we consider most important.
Conclusions
In conclusion, I think that rather than being overly focused on finding the most effective means of doing good, we should also be concerned with becoming more altruistic, caring and compassionate. A focus on reason and rationality should not lead us to neglect the things that are difficult or impossible to measure. We must remain humble, grounded and warm-hearted. Even when dealing with important technical issues such as AI safety or other areas of long-term concern (and especially in those cases), it is important to nurture human qualities and not allow them to recede into the background. I’m not trying to devalue these causes, but they can become an easy way to rationalise away and disconnect from the suffering that is happening right now. Because it can be painful to open up to and connect with suffering, it can be tempting to find a good reason not to.
EA isn’t about turning off the heart because it might lead to bias; it’s about turning on the head. And when head and heart are in conflict, we need to stop and think carefully about what price we might pay in trying to be as effective as possible. Good and caring actions that may seem ineffective can be very important for reasons that we cannot capture in a metric.
We can and should learn not only how to think more rationally, but also how to become more caring, for compassion is a skill we can train. This quality of caring, the intention to help, is the soil on which EA can grow. It is linked to our personal and interpersonal well-being and we should emphasise the importance of nurturing it rather than depleting it by becoming too attached to certain ideas.[2][3][4][5]
A few resources on practising compassion
Practising mindfulness (A wandering mind is a less caring mind[6])
Practising caring (e.g. through loving-kindness meditation)[7][8]
There are other resources available, such as those from CCARE (the Center for Compassion and Altruism Research and Education) at Stanford.
References and acknowledgements
I would like to thank Ysa Bourgine for her reflections and ideas, which contributed to this post, and everyone else with whom I have had good conversations about these topics.
[1] I took this expression from Emil Wasteson.
[2] Jazaieri H, Jinpa GT, McGonigal K, et al. Enhancing Compassion: A Randomized Controlled Trial of a Compassion Cultivation Training Program. J Happiness Stud. 2013;14(4):1113-1126. doi:10.1007/s10902-012-9373-z
[3] Jazaieri H, McGonigal K, Lee IA, et al. Altering the Trajectory of Affect and Affect Regulation: the Impact of Compassion Training. Mindfulness (N Y). 2018;9(1):283-293. doi:10.1007/s12671-017-0773-3
[4] Quaglia JT, Soisson A, Simmer-Brown J. Compassion for self versus other: A critical review of compassion training research. J Posit Psychol. 2021;16(5):675-690. doi:10.1080/17439760.2020.1805502
[5] Klimecki OM, Leiberg S, Ricard M, Singer T. Differential pattern of functional brain plasticity after compassion and empathy training. Soc Cogn Affect Neurosci. 2014;9(6):873-879. doi:10.1093/scan/nst060
[6] Jazaieri H, Lee IA, McGonigal K, et al. A wandering mind is a less caring mind: Daily experience sampling during compassion meditation training. J Posit Psychol. 2016;11(1):37-50. doi:10.1080/17439760.2015.1025418
[7] Hutcherson CA, Seppala EM, Gross JJ. Loving-Kindness Meditation Increases Social Connectedness. Emotion. 2008;8(5):720-724. doi:10.1037/a0013237
[8] Fredrickson BL, Boulton AJ, Firestine AM, et al. Positive Emotion Correlates of Meditation Practice: a Comparison of Mindfulness Meditation and Loving-Kindness Meditation. Mindfulness (N Y). 2017;8(6):1623-1633. doi:10.1007/s12671-017-0735-9