Your portrait of what the EA community could be is a beautiful one and made me tear up. You hit the nail on the head many times in this post on the subtle connections between things that I think can be hard to identify: the connection between heart and head, the E and the A, the overuse of jargon, and the hero worship, and so on. I have to say that as a fairly straight-passing gay man with immense amounts of privilege, even I feel many of these pressures and am often put off by the alpha-male machismo you often see in EA spaces.
I’ve witnessed discrimination and harassment, and heard of assault, in EA-ish spaces, and it seems pretty clear that this is contributing to the gender gap. I’ve definitely exhibited some of the combative and argumentative behaviors you mention. When I got into the EA community a few years ago, I began in global poverty and animal advocacy circles, and I found they were much better on these issues than the community is now, sadly. (Though both of those areas have plenty of problems of their own.)
I think Kelly moved us toward a type of dialogue on this issue that is lacking in the world, and I hope we can have more of it. Right now, discussions around diversity and inclusion seem polarized between the sort of “rationalist” discussion that’s snarky and dismissive on the one hand and an ostracizing mob mentality on the other hand. I don’t want to say EA should chart a middle path, because I think we should lean toward being overly zealous on diversity and inclusion rather than away, but I think EA and its aligned movements (animal advocacy in my mind) would benefit from a conversation that is at the same time inclusive and data-based. I don’t think the world has that type of conversation very often.
The lack of conversations that are both inclusive and data-based seems to lead to pretty bad results, where diversity and inclusion may not be promoted in the most effective ways, and people opposed to diversity and inclusion harbor suspicions about the world (e.g. that discrimination does not exist) that continue to fester unaddressed.
From my exploration of these matters, I’ve come to see that generally, when one reads about data on discrimination, differences between groups, etc., one finds that (a) discrimination exists and can be quite powerful; (b) there are differences between genders, but the differences are subtle and go in varied directions (e.g. men are more combative, and women are more collaborative, as Kelly notes); and (c) these differences are not the reason for the vast majority of gaps that we see.
I think that because discussion about differences between genders is often consigned to the more diversity-hostile corners of the internet, though, ideas that would be proven wrong by the data go unchallenged. Again, I think if we were to have the right sort of conversation on these issues, we would find that discrimination is indeed the primary cause of the gender gap in EA, but without that conversation, people will not be convinced. (And if an honest conversation engaged with data and personal experiences came to the conclusion that this was not the case, that would probably be good information to have.)
For instance, I read the Damore memo, but then saw this graph, which seems to be pretty good evidence that the vast majority of the gap in tech does not come from biological differences (and so is likely some iteration of discrimination, implicit or explicit). I don’t remember where I came across this graph, but it was very helpful to me. Without looking at the whole picture, though, one can look solely at the individual components of the picture (e.g. Damore’s arguments on specific gender differences) and come to conclusions that would be put in doubt with fuller information.
As an additional reason why I think EA is a movement that could have the right conversation on this, I think that EAs recognize a moral principle similar to equality of interests, where differences in personal traits do not lead to moral differences. It seems that in many diversity and inclusion conversations, both the right and the left consider personal trait differences to imply moral differences, and I think EAs can challenge and move beyond that assumption, though with care and only after we start improving on our demographics.
This is a very challenging issue because, as noted in a comment below, racism and sexism have long been motivated by biological essentialism, and it’s extremely disturbing to have people talk about a group you are a part of in this way. (As a Jew, I can say that I feel discomfort with the conversation about Jewish values below, for instance, though I don’t have a strong opinion on its propriety.) I think that the way to deal with this problem is to exercise caution when speaking about these sorts of things, to avoid casual discussion of them, and to have a higher evidence standard for when we talk about these things. I think that our community can learn the appropriate maturity to do that, though.
Anyway, all this is to say that I hope that as this conversation goes on, we can bring data to bear and recognize the implications of the way we speak for others in this community. Words and ideas do cause harm, and we should be utilitarians about the way we speak. With appropriate caution, though, I think that EAs can have a conversation that gets to the heart of the matter and offers a model for how these conversations can be had.
For those looking for examples of places where these discussions could be valuable, I have a few:
Gender and cosmopolitan values–The Better Angels of Our Nature cites feminism as one of the reasons for declines in all sorts of violence (war, sexual violence, torture), and I’ve seen enough data to match my intuition that feminism is also very good for animals. I think there are lots of things to explore empirically in this domain (that likely would have implications for the A vs. E debate), but they probably involve engaging with uncomfortable questions about where these gender differences arise.
On another note, animal advocates will often assert that if we focus on multiple causes, we will solve our diversity and inclusion problem. I think this is a very important claim to test, because focusing on multiple causes may be quite costly. I’m fully supportive of focusing on creating justice within our movements and groups, e.g. by aggressively fighting sexual assault and getting rid of income barriers, but I think the claim about movements’ outward focus is a debatable one that really needs to be empirically explored.
Similarly to the above note, animal advocates often work on issues meant to promote diversity and inclusion, including things like fighting urban food deserts, without looking into the evidence around them. This could not only hinder direct impacts but also create the impression that advocates’ diversity and inclusion efforts are an afterthought, without the same rigor applied to them that advocates apply to their work for animals.
Just in reply to the graph section—this post made me think about possible reasons for the discrepancy between computer science and law/medicine.
Yeah, I’ve read that and think there are very good points in there. I’d actually thought the graph said “physics” rather than “physical sciences,” so I now realize I misread it a bit. I do think that SSC piece leaves two questions open, though:
First, do we think that EA should be more like physics or more like medicine? This probably speaks to the E vs. A question Kelly addressed above. I think EA could benefit from having more people in it emphasizing the A. This is something we should all talk about at length, though.
Second, even if there are gender differences in interest that mean an equitable distribution in a field would be unequal, the gap may still be larger than what the differences alone suggest. I think that’s actually what we should expect: in fields that men are more interested in, the higher concentration of men should breed more sexism, and the gap should be inflated beyond what interest differences would produce.
“fighting urban food deserts without looking into the.”
I think there’s a word or phrase missing.