The most important century and the representativeness of EA

tldr: The more important this century is, the more important it is to increase EA’s representativeness of this century’s human population by recruiting members from developing countries who can contribute to priority setting. This follows from the possibility of value lock-in and the variance in moral values across cultures.

Holden Karnofsky argues that we are living in the most important century, and that “[w]e, the people living in this century, have the chance to have a huge impact on huge numbers of people to come.”

He also says that “there is a chance of ‘value lock-in’ here: whoever is running the process of space expansion might be able to determine what sorts of people are in charge of the settlements and what sorts of societal values they have, in a way that is stable for many billions of years.”

He is specifically talking about space expansion, but I think the same possibility applies to other long-termist projects such as AI alignment and pandemic preparedness.

If this is true, then I believe it increases the moral imperative to ensure that the Effective Altruism community is representative of humanity’s many different cultures and value systems.

According to the 2019 EA survey, “74% of EAs in the survey currently live in [a] set of 5 high-income English-speaking western countries.” Additionally, “EAs living outside of the USA and Europe reported the largest shares of non-engaged or only mildly engaged EAs, possibly stemming from their obstacles to participating in ‘high engagement activities’.” Demographics from CEA events and the 2020 EA survey put members of the EA community at ~50-75% white.

EA tends to draw members from—and at times organizations explicitly focus recruitment on—elite schools in countries like the US and UK. For example, India is the only developing country included in the CEA’s list of locations eligible for Community Building Grants. The only eligible university group outside the US and UK is at the University of Hong Kong.

The brief reasoning for this selectivity is that these locations are “especially high priority in terms of growing EA presence globally”; developing countries are not a priority. And universities are prioritized based on their “track record of having highly influential graduates (e.g. Nobel prize winners, politicians, major philanthropists).”

There would be a reasonable justification for this if EA were focused only on attracting high earners in rich countries to donate money to people in poor countries. Or if the EA community had already figured out all the answers to the most important moral questions, and just needed to attract highly influential people to implement successful change.

But EA is not just about sending money to the global poor, and we do not have all the answers yet. An important part of the Effective Altruism project is the search for, and discussion of, which moral values matter most, so that we can maximize those values. Long-termism requires making value judgments about what is best for humanity in the long term. It involves grappling with questions like:

  • What principles should govern space exploration and potential colonization of other worlds?

  • What role should AGI play in human societies?

  • How should tradeoffs between medical privacy and the ability to respond quickly to global pandemics be made?

Answers to these sorts of value questions vary from culture to culture. And to the extent that values determined by the EA community now may “lock in”, it is critical that the community’s values reflect the values of humanity as a whole. Demographic trends make this even more important: Pew projects that in 2100 half of all babies will be born in Africa, one of the regions least represented in EA.

What do we do about this? The one tactical recommendation I would offer is this: anecdotally, I’ve heard that many students in Nairobi have a hard time connecting with EA ideas because so much introductory EA material is pitched at people from rich countries. Perhaps creating new intro-to-EA materials and tweaking existing ones could help EA community organizers attract new members who can contribute to EA community discussions.

I don’t have any clear answers, but I would be very curious to see whether others agree with the assessment that the more important this century is, the more important it is to increase EA’s representativeness of this century’s human population.