An example of a particular practice that I think might look kind of innocuous but can be quite harmful to women and minorities in EA is what I’m going to call “buzz talk”. Buzz talk involves making highly subjective assessments of people’s abilities, putting a lot of weight on those assessments, and communicating them to others in the community. Buzz talk can be very powerful, but the beneficiaries of buzz seem to be disproportionately those who conform to a stereotype of brilliance: a white, upper-class male might be “the next big thing” while his black, working-class female counterpart wouldn’t even be noticed. These are the sorts of small, unintentional behaviors that I think it can be good for people to try to be conscious of.
I also think it’s really unfortunate that there’s such a large schism between those involved in the social justice movement and those who largely disagree with it (think: SJWs and anti-SJWs). The EA community attracts people from both of these groups, and I think that can cause people to see this whole issue through the lens of whichever group they identify with. It might be helpful if people tried to drop this identity baggage when discussing diversity issues in EA.
I strongly agree. Put another way, I suspect we, as a community, are bad at assessing talent. If true, that manifests as both a diversity problem and a suboptimal distribution of talent, but the latter might not be as visible to us.
My guess re the mechanism: Because we don’t have formal credentials that reflect relevant ability, we rely heavily on reputation and intuition. Both sources of evidence allow lots of biases to creep in.
My advice would be:
- When assessing someone’s talent, focus on the content of what they’re saying/writing, not the general feeling you get from them.
- When discussing how talented someone is, always explain the basis of your view (e.g., “I read a paper they wrote” or “Bob told me”).
How do we know that we are not bad at assessing talent in the opposite direction?
Maybe voters on the EA Forum should be blinded to the author of a post until they’ve voted!
Variant on this idea: I’d encourage a high-status person and a low-status person, both of whom regularly post on the EA Forum, to trade accounts for a period of time and see how that impacts their likes/dislikes.
Variant on that idea: No one should actually do this, but several people should talk about it, thereby making everyone paranoid about whether they’re a part of a social experiment (and of course the response of the paranoid person would be to actually vote based on the content of the article).
The problem is that the participants would not be blinded, so they would post differently. People act to play the role that society gives them.
I appreciate this comment for being specific!
I don’t understand what you mean by that; could you clarify?
So I think that if you identify with or against some group (e.g. ‘anti-SJWs’), then anything people say that pattern-matches to something that group would say triggers a reflexive negative reaction. This manifests in various ways: you’re inclined to read far more into the person’s statements than what they’re actually saying, or you set an overly demanding bar for them to “prove” that what they’re saying is correct. And I think all of that is pretty bad for discourse.
I also suspect that if we took a more detached attitude towards this sort of thing, disagreements about things like how much of a diversity problem EA has, or what is causing it, would be much less prominent than they currently are. These disagreements only affect the benefits we expect to accrue directly from trying to improve things, but the costs of trying are usually pretty low and the information value of experimenting is really high. So I don’t see many plausible views in this area that would make it rational to take a strong stance against the easier things people could try that might increase the number of women and minorities who get involved with EA.
Agreed. I’m not sure how we escape from that trap, except by avoiding loaded terms, even at the expense of brevity.
This used to be me… It wasn’t so much my beliefs that changed (I’m still not a leftist/feminist/etc.). It was more a change in attitude, related to why I rejected ultra-strict interpretations of utilitarianism: not becoming more agreeable or less opinionated, just no longer feeling like I was on a life-or-death mission. Anyway, I’m happy to discuss these things privately, including with people who are still on the anti-SJW mission.