EA skews young, white, male, and quantitative. Imagine you’re someone who doesn’t fit that profile but shares EA values, and you’re trying to decide “is EA for me?” You go to EA Global (where the audience is not very diverse) and attend a Double Crux workshop. If most of the people there talk about prioritizing the addition of AI researchers and hedge fund people (fields that skew young, male, and quanty), it might not feel very welcoming.
Basically, I think the question is framed so that it produces a negative externality for the community. And you could probably tweak the framing to produce a positive externality for the community, so I’d suggest considering that option unless there’s a compelling reason to favor the current framing. People can have a valuable discussion about which new perspectives would be helpful to add, even if they don’t think increasing diversity of perspectives is EA’s most important priority.