This is a very insightful and important comment. It gave me a lot to think about.
In the case of science, I think the example doesn’t serve your point well, though the point itself stands. To simplify: the reason I think the example doesn’t work is that science lost its battle against religion (which uses warm language) for centuries, and only started gaining ground once it began producing genuinely useful things. Religion produces very little, yet it still manages to put up a fight against science. Science succeeds despite being “cold” and counterintuitive, simply because it is so useful. (In fact, science becomes more “persuasive” to the general public when popular intellectuals like Dawkins, Tyson, or Feynman add a warm, poetic spin to it.)
“An analysis that argues for using more warm words needs to make a more precise claim about why we should use warmth, in particular if warmth is often hard to combine with authority, ambitiousness and objectivity (as other commenters have noticed).”
The reason I argue in favor of warmer language, without giving up on rationality, is that some of our ideas can make us seem cold when the opposite is usually true. I think seeming heartless is a specific weakness of the EA movement, for some of the reasons I elaborated on in the article. Many critiques of effective altruism repeatedly imply that effective altruists are cold.
We can address this weakness by putting greater effort into communicating our feelings and showing our enthusiasm. I don’t think we lose anything important by doing this, but I’m genuinely open to arguments otherwise. It would be problematic if it attracted undesirable people, but retaining our emphasis on rationality would mostly solve that problem.
All your concerns are interesting and important, and I would definitely like to see more discussion of this topic. We are at a stage where it’s really important to get these things right.
Edit: Thanks to this comment, I’ve been thinking more and more about the importance of showing strength, consistency, and determination in pursuing our goals. This was something I really liked about Yudkowsky’s texts: he makes a great effort to convey how important his goal is to him and why we should keep pushing forward and improving.
I’ve been writing a lot of roughly 500-word announcements for charity events, trying to combine at least these two advantages. What I like to do is use warm language for the first and last sentences (the call to action) and write “normally” (for me) in the middle. The idea is that people who are skimming will read the first sentence, or parts of it, and react to it more with System 1 than System 2 when deciding whether to read the rest; and if they do read the rest, I can hopefully trust them to judge it on the merits of its content.
Firstly, thanks for the post above! These are important questions to consider.
I think the main point of your post is that the misperception of EAs as cold is preventing growth, which is why we’d want to correct it. Habryka replied that what really matters is “are we growing the EA ecosystem in the right way?” In your response to him, you say that you argue for warmer language because it corrects a false perception of us and because it’s a common point of criticism.
But to reply to Habryka, a clearer argument is needed for why those things matter. It could be that we are warm and our language falsely makes us seem cold, yet this isn’t a problem because it doesn’t adversely affect growing the EA ecosystem in a healthy way, even if some people are turned off by it.
Also, it might not be practical or worthwhile to defend against every criticism based on misrepresentations. This critique might not even be the critic’s true rejection of EA: defend against it, and they’ll generate new critiques based on other distortions of who we are. Is this particular critique one we need to defend against because it’s damaging us more than the next critique they’ll focus on?
Yes, totally. The next post was going to consist of some ideas about critics’ true rejections and how to deal with them.
The question of what a healthy EA ecosystem would look like is really interesting and worth exploring. Somebody should write more about it; I may eventually do so myself.
My current intuition is that we need more people from diverse fields of knowledge and with diverse skills, since they can contribute to EA in unique ways beyond donating. To gain this benefit, I think it’s worth losing a bit in other respects if we have to. I will think more about it, though.