You’re in charge of outreach for EA. You have to choose one demographic to focus on, introducing them to EA concepts and bringing them into the movement. What single demographic do you prioritize?
What sort of discussions does this question generate? Do people mostly discuss demographics that are currently overrepresented or underrepresented in EA? If there’s a significant amount of discussion around how and why EA needs more of groups that are already overrepresented, it probably wouldn’t feel very welcoming to someone from an underrepresented demographic. You may want to consider tweaking it to something like “What underrepresented demographic do you think EA most needs more of on the margins?”
FWIW, I have similar concerns that people might interpret the question about lying/misleading as suggesting EA doesn’t have a strong norm against lying.
Here are demographics that I’ve heard people list:
AI researchers (because of relevance to x-risk)
Teachers (for spreading the movement)
Hedge fund people (who are rich and analytical)
Startup founders (who are ambitious and agenty)
Young people/college students (because they’re the only people who can be sold on weird ideas like EA)
Ops people (because 80k and CEA said that’s what EA needs)
All of these have very different implications about what is most important on the margin in EA.
Aside from Ops people, I’d guess the other five groups are already strongly overrepresented in EA. This exercise may be sending an unintended message that “EA wants more of the same”, and I suspect you could tweak the question to convey “EA values diverse perspectives” without sacrificing any quality in the discussion. Over the long term, you’ll get much better discussions because they’ll incorporate a broader set of perspectives.
I’m not sure I follow. The question asks what the participants think is most important, which may or may not be diversity of perspectives. At least some people think that diversity of perspectives is a misguided goal that erodes core values.
Are you saying that this implies that “EA wants more of the same” because some new EA (call him Alex) will be paired with a partner (Barbra) who gives one of the above answers, and then Alex will presume that what Barbra said was the “party line” or “the EA answer” or “what everyone thinks”?
EA skews young, white, male, and quantitative. Imagine you’re someone who doesn’t fit that profile but has EA values, and is trying to decide “is EA for me?” You go to EA Global (where the audience is not very diverse) and go to a Double Crux workshop. If most of the people talk about prioritizing adding AI researchers and hedge fund people (fields that skew young, male, and quanty) it might not feel very welcoming.
Basically, I think the question is framed so that it produces a negative externality for the community. And you could probably tweak the framing to produce a positive externality for the community, so I’d suggest considering that option unless there’s a compelling reason to favor the current framing. People can have a valuable discussion about which new perspectives would be helpful to add, even if they don’t think increasing diversity of perspectives is EA’s most important priority.
I made different points, but in this comment I’m generally concerned that doing something like this at big EA events could publicly misrepresent and oversimplify a lot of the issues EA deals with.