You’ve laid out your opinions clearly. It is well cited, and has interesting and informative accompanying sources. It’s a good post. However, I disagree with some of the underlying attitudes (even while not particularly objecting to some of the recommended methods).
In an ideal world where all people are rational, the ideas mentioned in this forum post would be completely useless.
The thing is, this is a purely inside view. It sort of presupposes effective altruist ideas are correct, and that the only barrier to widespread adoption is irrationality, rather than any sensible sort of skepticism.
While humans can be irrational in distributing status, there is such a thing as legitimately earned status. If we put on our idealist hats for just a moment and forget all the extremely silly things humans accord status to, status can represent the “outside view”: if institutions we respect seem to respect EA, that should increase our confidence in EA ideas. Not because we’re status-climbing apes, but because “capable of convincing me” shouldn’t be a person’s only bar for trusting an argument. One should sensibly understand the limited scope of one’s own judgement regarding big topics.
Now, taking our idealist hats off, obviously we can’t just trust what most people think, or consider all “high status” institutions as equally legitimate. We have to be discerning. But there are institutions (such as academia, in my opinion) whose approval matters because it functions as legitimate external validation. It’s not just social currency; it’s well-earned social currency. Not only that, it’s an opportunity to send our good ideas elsewhere to develop and mutate, as well as an opportunity to allow our bad ideas to be culled.
Unfortunately, people often are much less rational than we’d like to admit. Acknowledging this might be a pragmatic way for EA to improve outreach effectiveness.
The other issue is that when one is forming a broad, high-level strategy for engaging with the world, it should feel good. The words one uses should make one feel warm inside, not exasperated at the irrationality of the world and the necessity of stooping to slimy-feeling methods to win. Lest anyone irrationally (/s) dismiss this as “warm fuzzy altruism” in Bosch’s linked taxonomy, let me pragmatically (/s) employ an appeal to authority: Yudkowsky has made the same point. If it feels cynical and a touch Machiavellian, it usually will not ultimately produce morally wholesome results. Personally, I think if you want to really convince people, you shouldn’t use methods that would make them feel like you tricked them if they knew what you were doing.
Not to mention...it’s just sort of impractical for EA to attempt “we know you are irrational and we’re not above pushing your irrationality buttons” strategies. EA organizations are generally scrupulous about transparency so that we can hold each other accountable. This means that any cynical outreach attempts will be transparent as well. In general my sense is that idealist institutions can’t effectively wield some of these more cynical methods.
Also, as a sort of aside, I don’t think there’s anything irrational about appealing to emotions. The key is to appeal to emotions in a way that brings out behavior which is a true expression of people’s values. Often, when someone has a “bad” ideology, it is emotions of compassion that bring them out of it. Learning to better engage people on an emotional level is not in any way opposed to presenting logical and rational cases for things.
How can EA help people increase their status?
...in a non-cynical way?
By acquiring well-earned legitimacy! Make real positive impacts in areas other people care about. That means you can also help individual effective altruists make real, measurable impacts that they can put on their resume and thereby increase their career capital. Create arguments that other intellectuals agree with and cite. Mentor other people and give them skills. Create mechanisms for people to be public, in a socially graceful way, about their donations and the personal sacrifices they make to further a cause (it inspires others to do the same). These are all things that the Effective Altruist community is currently doing, and it’s been working regardless of whether or not people are wearing suits.
What all these methods have in common is that they work with people’s rationality (and true altruistic motives), rather than working around their irrationality (and hidden selfish motives). These are methods that encourage involvement with EA because people are convinced that their personal involvement with EA will help further their (altruistic, but also other) goals. The status-raising effects of these methods are secondary to real accomplishment: they put forth honest signals of competence and skill, which the larger society recognizes because it is actually valuable. The appeals to emotion work by being connected to the reality of actually accomplishing the tasks that those emotions are oriented towards.
So, I would generally agree with your call for EAs to think about more ways to gain legitimacy. I just want to strongly prioritize well-earned legitimacy...whereas this post comes off as though it’s largely about gaining less legitimate forms of status. (Perhaps due to an implicit feeling that all status is illegitimate?)
It seems that we disagree about the extent to which people’s motivation to pursue status (well-earned or not) guides our behavior. I don’t think the status-raising effects are secondary to real accomplishment; rather, I think they are an important underlying reason for our pursuit of accomplishing anything at all.
I agree that some ways of receiving status are more legitimate than others, that it’s important for EA to focus on legitimate status, and that it’s more important to have a good argument than to wear a suit. But because all humans are also (and maybe even above all) status-climbing apes, I think that EA’s pursuit of legitimate status is affected by content-irrelevant elements. This is why I think it might not be best to view legitimately earned status in isolation from the more irrational parts of status, but rather to see how the two interact.
You mentioned that EA could help people increase their status in a non-cynical way, like helping individual effective altruists make measurable impacts, or creating arguments that other intellectuals agree with and cite, and I agree that these are important ways people could increase their status. However, I don’t think they contradict the ways of increasing status I mentioned in the post. Different methods might differ in the extent to which they rely on content-irrelevant status-increasing elements, but in my opinion, we can never fully disregard these more irrational aspects of why people regard something as high-status. In the post I tried to emphasize that EA might consider making greater use of the strategies that rely on content-irrelevant status-increasing elements. That is because I think EA is currently overly cautious about using them and, as a result, might miss out on reaching people who would be valuable to EA’s cause.
I think that finding the right kind of “packaging” for EA’s content (while not changing the content) is useful when reaching out to all audiences, and that this can help make outreach messages clearer and more inviting for people without making them feel like they have been tricked into believing something.