I don’t have relevant data nor have I thought very systematically about this, but my intuition is to strongly agree with basically everything you say.
In particular, I feel that the claim “Having exposure to a diverse range of perspectives and experiences is generally valuable” squares fairly well with my own experience. There are just so many moving parts to how communities and organizations work (how to moderate meetings, how to give feedback, how much hierarchy and structure to have, and so on) that I think it’s fairly hard to even be aware of the full space of options (and impossible to experiment with more than a negligible fraction of it). An influx of people with diverse experiences in that respect can massively multiply the amount of information available on these intangible things. This seems particularly valuable for EA because I feel that relative to the community’s size there’s an unusual amount of conformity on these things within EA, perhaps due to the tight social connections within the community and the outsized influence of certain ‘cultural icons’.
Personally, I feel that I’ve learned many of the skills (both intellectual and interpersonal) that are most useful in my current work outside of EA; in fact, outside of EA’s core focus (roughly, the practical implications of ‘sufficiently consequentialist’ ethics), I’ve learned surprisingly little within EA, even after correcting for having been in the community for only a small fraction of my life.
(Perhaps more controversially, I think this also applies to the epistemic rather than the purely cultural or organizational domain. That is, my claim is roughly that things like phrasing lots of statements in terms of probabilities, having discussions mostly in Google docs vs. in person, the kind of people one circulates drafts to, how often one is forced to explain one’s thoughts to people one has never met before, and various small things like that affect the overall epistemic process in messy ways that are hard to track or anticipate other than by actually having experienced how several alternatives play out.)
Like Milan, I agree with the main point of your comment, and I also think that the EA community conforms less than most communities do.
Maybe ironically, I also think that many people interested in EA have relatively little experience with communities in general, which makes it harder for them to know what is expected: things like group slang, strong identities, close connections, and group ‘rituals’ are very common in most communities.
Thank you; your comment made me realize both that I maybe wasn’t quite aware of the meaning and connotations ‘community’ has for native speakers, and that I may have been implicitly comparing EA against groups that aren’t communities in that sense. I guess it’s also quite unclear to me whether I think it’s good for EA to be a community in this sense.
+1 to the general thrust of this.
I feel that relative to the community’s size there’s an unusual amount of conformity on these things within EA
Nitpick: probably not? E.g. compare with US social justice or US social conservatism, which are much larger movements (EA is probably < 100,000 people in total; each of those is probably ~500,000 to 10 million depending on who you count) and which seem to be much more ideologically conformist.
Hmm, thanks for sharing your impression. I think talking about specific examples is often very useful for spotting disagreements and helping people learn from each other.
I’ve never lived in the US or otherwise participated in one of these communities, so I can’t tell from first-hand experience. But my loose impression is that there have been substantial disagreements both synchronically and diachronically within those movements; for example, in social justice about trans* issues or sex work, and in conservatism about interventionist vs. isolationist foreign policy, to name but a few. Of course, EAs disagree substantially about, say, their favored cause area. But my impression is that disagreements within those other movements can be much more acrimonious (to be clear, I think it’s mostly good that we don’t have this in EA), and also that the difference in ‘cultural vibe’ I would get from attending, say, a Black Lives Matter grassroots group meeting vs. a meeting of the Hillary Clinton presidential campaign team is larger than the one between the local EA group at Harvard and the EA Leaders Forum. Do your impressions of these things differ, or were you thinking of other manifestations of conformity?
(Maybe that’s comparing apples to oranges, because a much larger proportion of EAs are from privileged backgrounds and in their 20s, and if one ‘controlled’ social justice and conservatism for these demographic factors they’d be closer to EA levels of conformity. On the other hand, maybe it’s something about EA that contributes to causing this demographic narrowness.)
Also, we have an explanation for the conformity within social justice and conservatism that on some readings might rationalize it: Haidt’s moral foundations theory. To put it crudely, given that you’re motivated by fairness and care but not authority etc., maybe it just is rational to hold the ‘liberal’ bundle of views. (I think that’s true only to a limited but still significant extent, and also that the story for why the mistakes reflected by the non-rational parts are so correlated may differ from the corresponding story for EA in an interesting way.) By contrast, I’m not sure there is a similarly rationalizing explanation for why many EAs agree on both (i) there’s a moral imperative for cost-effectiveness, and (ii) you should one-box in Newcomb’s problem, and for why many know more about cognitive biases than about the leading theories for why the Industrial Revolution started in Europe rather than China.
Do your impressions of these things differ, or were you thinking of other manifestations of conformity?
I think the cultural vibe you would get at a Dank EA Memes meetup (e.g. “Dank EA Global 2018”) would be pretty different from the vibe at a Leverage meetup, and both of those pretty different from the vibe at a GiveWell happy hour.
Agree that there is likely more acrimony in social justice communities than in EA. I actually think this flows from their conformity, as I think there’s a lot of pressure to virtue signal & a lot of calling out when a person / group hasn’t virtue signaled sufficiently (for whatever criterion of “sufficient”). Somewhat related.
By contrast, I’m not sure there is a similarly rationalizing explanation for why many EAs agree on both (i) there’s a moral imperative for cost-effectiveness, and (ii) you should one-box in Newcomb’s problem, and for why many know more about cognitive biases than about the leading theories for why the Industrial Revolution started in Europe rather than China.
Super interesting point!
I want to think about this more. Presently, I wouldn’t be surprised if (i), (ii), and the cognitive-biases-over-economic-history point all appealed more to a certain shape of mind – which could generate conformity along some axes.
Is this not explained by founder effects from Less Wrong?
It probably is, but I don’t think this explanation is rationalizing. That is, I don’t think this founder effect would provide a good reason to think that this distribution of knowledge and opinions is conducive to reaching the community’s goals.
Sure, but that just pushes the interesting question back a level – the question becomes “why was LessWrong a viable project / Eliezer a viable founder?”
Which isn’t to say that there’s a good level of conformity in EA. I think EA would benefit from having less conformity.
But I think the base rate is really, really bad.