I strongly agree with many points in the article. But the main thing I want to call into question, though not necessarily disprove, is the thesis that “good associations” equals “warm words”. I would even go so far as to suggest that warm words might not be a good idea for Effective Altruism at all.
A good case to compare ourselves to here is Science, and how it is perceived. Science is famous for lacking warm words in its language. If you go to the page of a research institute, or a scientific journal, you will be bombarded with cold and analytic words. Government is another interesting reference class. It uses no warm words, and has avoided them from the very beginning. Yet Science and Government are probably the two most successful institutions to have arisen in the last thousand years. How is it that both of these institutions are so successful, even though they completely lack warmth in their presentation?
It is true that warm words can create positive associations, but so can cold words. The space of associations is large, and “warmth” is only one positive attribute that you can use. Equally important dimensions are “growth”, “stability”, “authority”, “honesty”, “wealth”, “reliability” and “consistency” (and many more). An analysis that argues for using more warm words needs to make a more precise claim about why we should use warmth, in particular if warmth is often hard to combine with authority, ambitiousness and objectivity (as other commenters have noticed).
The second thing I want to highlight is the question of “what kind of person do we want to attract?”. As EAs, our goal is not just to grow; it is to create a healthy ecosystem in which ideas can thrive, projects can be started, and the climate of discussion remains friendly and at a high intellectual level (among many other things). In creating this ecosystem, the question of “will this keep some people from joining the movement?” is comparatively unimportant next to the question of “how can we attract the people that the current movement is lacking?”.
And so I pose the question: “does using warm language attract any groups of people that would significantly improve the health of the EA ecosystem?”. To answer this, we need to understand what kind of person is attracted by warm and compassionate language, and what kinds of people we want more of in EA. To answer these sub-questions, we might start by looking at existing communities that use warmer language than we do, and asking whether having members migrate from them into our community would be a net improvement for EA.
This analysis would require more space than I have in this comment, but looking at it from the outside, it isn’t immediately clear to me that the communities known for warmth would be particularly valuable for EA. That said, communities that are somewhat more associated with warmth, such as the education, biology and social science communities, are indeed groups that EA appears to be lacking, and they could make valuable contributions to the EA ecosystem. On the other hand, it is not clear to me that growth from the social impact sector, another community that emphasizes warmth more than we do (and one that would be compatible with other EA ideas), would be a strong improvement.
In conclusion, I mostly want to highlight that using more warm language is not clearly a good choice, and might come with higher costs than naively expected. To settle that question, I would love to see more analysis along the lines of this post, and this comment, from the broader EA community. I also want to emphasize that the language we use is a really important choice, and that almost every decision in this domain comes with tradeoffs. Emphasizing warmth will almost always mean a lower emphasis on the other attributes we were highlighting previously. We need to be aware of what those tradeoffs are, and choose our signaling carefully and consciously, based on an analysis of what community we want to build. This is a difficult task, but one with large potential payoffs.
This comment is very smart and important. You made me think a lot.
In the case of science, I think the example isn’t good for your point, but the point itself stands nonetheless. I will simplify, but the reason I think the example doesn’t work is that science lost its battle with religion (warm language) for centuries, and only started gaining ground once it began producing really useful things. Religion produces very little, and it still manages to put up a fight against Science. Science is successful despite the fact that it’s “cold” and counterintuitive, simply because it’s so useful. (In fact, science becomes more “persuasive” for the general public when popular intellectuals like Dawkins, Tyson or Feynman add a warm poetic spin to it.)
“An analysis that argues for using more warm words needs to make a more precise claim about why we should use warmth, in particular if warmth is often hard to combine with authority, ambitiousness and objectivity (as other commenters have noticed).”
The reason I argue in favor of using warmer language, without giving up on rationality, is that some of our ideas can make us seem cold, when usually the opposite is true. I think that seeming heartless is a specific weakness of the EA movement, for some of the reasons I elaborated on in the article. There are many critiques of effective altruism that repeatedly imply that effective altruists are cold.
We can cover this weakness by putting greater effort into communicating our feelings and showing our enthusiasm. I don’t think we lose anything important by doing this, but I’m genuinely open to being wrong about it. It would be problematic if warmer language drew undesirable people, but by retaining our emphasis on rationality that problem would be mostly solved.
All your concerns are very interesting and important, and I would definitely like to see more discussion of this topic. We are at a stage in which it’s really important to get these things right.
Edit: Thanks to this comment, I’ve been thinking more and more about the importance of showing strength, consistency and determination to achieve our goals. This was something I really liked about Yudkowsky’s texts: he makes a great effort to convey how important his goal is to him, and why we should keep pushing forward and improving.
I’ve been writing a lot of roughly 500-word announcements for charity events, trying to combine at least these two advantages. What I like to do there is use warm language for the first and last sentences (the call to action) and write “normally” (for me) in the middle. The idea is that when people are skimming, they’ll read the first sentence or parts of it, and will react to it more with System 1 than System 2 when deciding whether to read the rest; and if they do read the rest, then I can trust them to assess it on the merits of its content, maybe.
Firstly, thanks for the post above! These are important questions to consider.
I think the main point of your post is that the misperception of EAs as cold is preventing growth, and that’s why we’d want to correct it. Habryka replied that what really matters is ‘are we growing the EA ecosystem in the right way?’. In your response to him, you say that you argue for warmer language because it corrects a false perception of us, and because it’s a common point of criticism.
But to reply to Habryka, a clearer argument is needed for why those things matter. It could be that we really are warm and our language falsely makes us seem cold, yet this isn’t a problem, because it doesn’t adversely affect growing the EA ecosystem in a healthy way, even if some people are turned off by it.
Also, it might not be practical or worthwhile to defend against every criticism based on misrepresentations. This critique might not even be the critic’s true rejection of EA. Defend against this one, and they’ll generate new critiques based on other distortions of who we are. Is this particular critique one we need to defend against because it’s damaging us more than the next critique they’ll focus on?
Yes, totally. The next post was going to consist of some ideas about critics’ true rejections and how to deal with them.
The question of what a healthy EA ecosystem would look like is really interesting and worth exploring. Somebody should write more about it; I may eventually do it myself.
My current intuition is that we need more people from diverse fields of knowledge and with diverse skills, since they can contribute to EA in unique ways beyond donating. To gain this benefit, I think it’s worth losing a bit in other regards if we have to. I will think more about it, though.