EA has a diversity problem. Proposal: identity/affinity subgroups

tl;dr: Many folks have recently pointed out that diversity is an issue in the EA community. I agree. I think the EA community could benefit from the creation of subgroups based on identity and affinity, to make a wider variety of people feel welcome, heard, and safe, and to reduce ideological bias.

Note: If you agree that there is a diversity issue in EA, the top half of this post may not contain new information for you. Feel free to jump down to the "Proposal to help mitigate diversity issues" section below.

Background

As someone new to EA, I find myself asking this question:

EA’s ideas about how to do good seem unique and unconventional (e.g. longtermism). Are they this way as a result of uniquely clear and farsighted thinking, or a result of being out of touch with reality?

It’s hard for me to answer this question with confidence. A lack of ideological (and demographic) diversity is one way the latter possibility (that EAs have an unusually biased view of reality) could manifest.

Signs of a diversity problem

Demographic uniformity

  • communities that are whiter and more male than the broader population

  • lack of older members

  • members seem to be disproportionately from wealthy backgrounds

  • members are disproportionately college grads/PhDs from elite universities

Ideological uniformity

  • bias toward solving problems via debate (in which the winner is often viewed as more right than the loser, even if the loser has valuable information to add)

  • wide variation in upvotes/response rates on the EA Forum; unpopular topics and views sometimes get little attention and little feedback

  • a natural tendency to favor ideas presented by those who have high status within the group over ideas from less prominent voices

  • use of quantitative estimation to create a false sense of precision when discussing uncertain events

  • bias towards ideas that are largely unpopular in society, and even among many altruists, like utilitarianism, longtermism, and technocratic utopianism

  • some topics like AI alignment are frequently featured on EA forums, whereas similar topics like AI ethics are rarely seen (possibly due more to ideological divides than to pure utility)

  • bias towards problems with technology-oriented solutions (AI alignment) and against problems with human-oriented solutions (politics, racism)

Proposal to help mitigate diversity issues

  • Encourage and facilitate the creation of subgroups based on affinity/identity, where populations in the minority can have a majority voice

  • Conferences where speakers from these subgroups are featured

  • Add info to subforums page about these different groups, for observability

  • Encourage people to tag forum posts with all relevant affinity/identity groups, so it’s possible to view breakdowns by group. That said, encourage posting in the same top-level forum so these ideas remain discoverable

Example groups

Identity

  • women in EA

  • lgbtq in EA

  • PoC in EA

  • EA Buddhists (or other religious groups)

Affinity

  • EA longtermists

  • EAs for short-term causes

  • EAs for AI ethics

  • EAs at startups

  • EAs for social justice

  • EA Democrats (or other political groups)

Shared interests or life situations

  • new to EA

  • EAs with kids

  • EA singles

  • EAs in software (or other occupations); I believe this example already exists as a forum topic

  • EAs who are middle aged or older

  • EAs from (part of world)

  • EAs who speak X first language

Proposal details

Implementation

I’d like to see a subgroup founder/facilitator toolkit that provides the following info for anyone who wants to form a new group:

  • a guide to help inexperienced EAs through the process of becoming organizers

  • tips for how to recruit members to your subgroup

  • tips for discoverability (e.g. how to explain your group to new members and how to become visible to all EAs who may be interested)

  • advice on how to facilitate (virtual & in-person) meetings in a way that is safe for members and encourages effective communication

  • advice on getting funding for events (or on raising money from within the group)

  • advice on measuring the effectiveness of your group

  • guidelines for being an “official” part of EA; some amount of alignment with the larger goals of EA should be required

Benefits

  • making it easier to recruit and attract non-EAs from different demographic groups

  • giving folks with similar views/​backgrounds/​identities a community and a shared voice

  • helping EA interface in a healthy way with EA-adjacent folks who belong to various other groups with overlapping missions

  • making it possible (e.g. via subforums) to easily discover which issues are most important to various subgroups, to help counteract blind spots

Final note

This is my first post on the EA Forum (for Draft Amnesty Day). If you have criticism or meta-feedback about how to make a good post, it is more than welcome! I want to know how to communicate as effectively as possible on this forum. Thanks!