We are incredibly homogenous

Preamble

This is an extract from a post called “Doing EA Better”, which argued that EA’s new-found power and influence obligate us to solve our movement’s significant problems with respect to epistemics, rigour, expertise, governance, and power.

We are splitting DEAB up into a sequence to facilitate object-level discussion.

Each post will include the relevant parts of the list of suggested reforms. There isn’t a perfect correspondence between the subheadings of the post and the reforms list, so not all reforms listed will be 100% relevant to the section in question.

Finally, we have tried (imperfectly) to be reasonably precise in our wording, and we ask that before criticising an argument of ours, commenters ensure that it is an argument that we are in fact making.

Main

Summary: Diverse communities are typically much better at accurately analysing the world and solving problems, but EA is extremely homogenous along essentially all dimensions. EA institutions and norms actively and strongly select against diversity. This provides short-term efficiency at the expense of long-term epistemic health.

The EA community is notoriously homogenous, and the “average EA” is extremely easy to imagine: he is a white male[9] in his twenties or thirties from an upper-middle class family in North America or Western Europe. He is ethically utilitarian and politically centrist; an atheist, but culturally protestant. He studied analytic philosophy, mathematics, computer science, or economics at an elite university in the US or UK. He is neurodivergent. He thinks space is really cool. He highly values intelligence, and believes that his own is significantly above average. He hung around LessWrong for a while as a teenager, and now wears EA-branded shirts and hoodies, drinks Huel, and consumes a narrow range of blogs, podcasts, and vegan ready-meals. He moves in particular ways, talks in particular ways, and thinks in particular ways. Let us name him “Sam”, if only because there’s a solid chance he already is.[10]

Even leaving aside the ethical and political issues surrounding major decisions about humanity’s future being made by such a small and homogenous group of people, especially given that the poor of the Global South will suffer most in almost any conceivable catastrophe, having the EA community overwhelmingly populated by Sams or near-Sams is decidedly Not Good for our collective epistemic health.

As noted above, diversity is one of the main predictors of the collective intelligence of a group. If EA wants to optimise its ability to solve big, complex problems like the ones we focus on, we need people with different disciplinary backgrounds[11], different kinds of professional training, different kinds of talent/intelligence[12], different ethical and political viewpoints, different temperaments, and different life experiences. That’s where new ideas tend to come from.[13]

Worryingly, EA institutions seem to select against diversity. Hiring and funding practices often select for highly value-aligned yet inexperienced individuals over outgroup experts, university recruitment drives are deliberately targeted at the Sam Demographic (at least by proxy), and EA organisations are advised to maintain a high level of internal value-alignment to maximise operational efficiency. The 80,000 Hours website seems purpose-written for Sam, and is noticeably uninterested in people with humanities or social sciences backgrounds,[14] or those without university education. Unconscious bias is also likely to play a role here – it does everywhere else.

The vast majority of EAs will, when asked, say that we should have a more diverse community, but in that case, why is only a very narrow spectrum of people given access to EA funding or EA platforms? There are exceptions, of course, but the trend is clear.

It’s worth mentioning that senior EAs have done some interesting work on moral uncertainty and value-pluralism, and we think several of their recommendations are well-taken. However, the focus is firmly on individual rather than collective factors. The point remains that an overwhelmingly utilitarian community in which everyone individually tries to keep all possible viewpoints in mind is no substitute for a genuinely philosophically diverse one. None of us is so rational as to obviate the need for true diversity through our own thoughts.[15]

Suggested reforms

Below, we have a preliminary, non-exhaustive list of relevant suggestions for structural and cultural reform that we think may be good ideas and should certainly be discussed further.

It is of course plausible that some of them would not work; if you think so for a particular reform, please explain why! We would like input from a range of people, and we certainly do not claim to have all the answers!

In fact, we believe it important to open up a conversation about plausible reforms not because we have all the answers, but precisely because we don’t.

Italics indicate reforms strongly inspired by or outright stolen from Zoe Cremer’s list of structural reform ideas. Some are edited or merely related to her ideas; they should not be taken to represent Zoe’s views.

Asterisks (*) indicate that we are less sure about a suggestion, but sure enough that we think it is worth considering seriously, e.g. through deliberation or research. Otherwise, we have been developing or advocating for most of these reforms for a long time and have a reasonable degree of confidence that they should be implemented in some form or another.

Timelines are suggested to ensure that reforms can become concrete. If stated, they are rough estimates, and if there are structural barriers to a particular reform being implemented within the timespan we suggest, let us know!

Categorisations are somewhat arbitrary; we just needed to break up the text for ease of reading.

Critique

Red Teams

  • Red teams should be paid and composed of people with a variety of views; former EAs and non-EAs should be actively recruited for red-teaming

    • Interesting critiques often come from dissidents/​exiles who left EA in disappointment or were pushed out due to their heterodox/​”heretical” views (yes, this category includes a couple of us)

  • The judging panels of criticism contests should include people with a wide variety of views, including heterodox/​”heretical” views

Epistemics

General

  • EA should study social epistemics and collective intelligence more, and epistemic efforts should focus on creating good community epistemics rather than merely good individual epistemics

    • As a preliminary programme, we should explore how to increase EA’s overall levels of diversity, egalitarianism, and openness

  • EAs should practise epistemic modesty

    • We should read much more, and more widely, including authors who have no association with (or even open opposition to) the EA community

    • We should avoid assuming that EA/​Rationalist ways of thinking are the only or best ways

    • We should actively seek out not only critiques of EA, but critiques of and alternatives to the underlying premises/​assumptions/​characteristics of EA (high modernism, elite philanthropy, quasi-positivism, etc.)

    • We should stop assuming that we are smarter than everybody else

  • EAs should make a point of engaging with and listening to EAs from underrepresented disciplines and backgrounds, as well as those with heterodox/​“heretical” views

Ways of Knowing

  • EAs should consider how our shared modes of thought may subconsciously affect our views of the world – what blindspots and biases might we have created for ourselves?

  • EAs should increase their awareness of their own positionality and subjectivity, and pay far more attention to e.g. postcolonial critiques of western academia

    • History is full of people who thought they were very rational saying very silly and/​or unpleasant things: let’s make sure that doesn’t include us

  • EAs should study other ways of knowing, taking inspiration from a range of academic and professional communities as well as indigenous worldviews

Diversity

  • EA institutions should select for diversity

    • With respect to:

      • Hiring (especially grantmakers and other positions of power)

      • Funding sources and recipients

      • Community outreach/​recruitment

    • Along lines of:

      • Academic discipline

      • Educational & professional background

      • Personal background (class, race, nationality, gender, etc.)

      • Philosophical and political beliefs

    • Naturally, this should not be unlimited – some degree of mutual similarity of beliefs is needed for people to work together – but we do not appear to be in any immediate danger of becoming too diverse

  • Previous EA involvement should not be a necessary condition to apply for specific roles, and job postings should not assume that all applicants will identify with the label “EA”

  • EA institutions should hire more people who have had little to no involvement with the EA community, provided that they care about doing the most good

  • People with heterodox/​“heretical” views should be actively selected for when hiring to ensure that teams include people able to play “devil’s advocate” authentically, reducing the need to rely on highly orthodox people accurately steel-manning alternative points of view

  • Community-building efforts should be broadened, e.g. involving a wider range of universities, and group funding should be less contingent on the perceived prestige of the university in question and more focused on the quality of the proposal being made

  • EA institutions and community-builders should promote diversity and inclusion more, including funding projects targeted at traditionally underrepresented groups

  • A greater range of people should be invited to EA events and retreats, rather than limiting e.g. key networking events to similar groups of people each time

  • There should be a survey on cognitive/​intellectual diversity within EA

  • EAs should not make EA the centre of their lives, and should actively build social networks and career capital outside of EA

Expertise & Rigour

Reading

  • Insofar as a “canon” is created, it should be of the best-quality works on a given topic, not the best works by (orthodox) EAs about (orthodox) EA approaches to the topic

    • Reading lists, fellowship curricula, and bibliographies should be radically diversified

    • We should search everywhere for pertinent content, not just the EA Forum, LessWrong, and the websites of EA orgs

    • We should not be afraid of consulting outside experts, both to improve content/​framing and to discover blind-spots

Experts & Expertise

  • EAs should deliberately broaden their social/​professional circles to include external domain-experts with differing views

  • When hiring for research roles at medium to high levels, EA institutions should select in favour of domain-experts, even when that means passing over a highly “value-aligned” or prominent EA

Funding & Employment

Grantmaking

  • Grantmakers should be radically diversified to incorporate EAs with a much wider variety of views, including those with heterodox/​”heretical” views

Transparency & Ethics

Moral Uncertainty

  • EAs should practise moral uncertainty/pluralism, not just talk about it

  • EAs who advocate using ethical safeguards such as “integrity” and “common-sense morality” should publicly specify what they mean by this, how it should be operationalised, and where the boundaries lie in their view

  • EA institutions that subscribe to moral uncertainty/​pluralism should publish their policies for weighting different ethical views within 12 months