An Elephant in the Community Building Room

These are my own views, and not those of my employer, EV Ops, or of CEA, for whom I have contracted in the past and am currently contracting. This was meant to be a strategy fortnight contribution, but it’s now a super delayed, unofficial, and under-written strategy fortnight contribution. [1]

Before you read this:

This is pretty emotionally raw, so please 1) don’t update too much on it if you think I’m just being dramatic, and 2) be aware that I might come back and endorse or delete this at some point. I’ve put off writing this for a long time, because I know that some of the conclusions or implications might be hurtful or cause me to become even more unpopular than I already feel I am. As a result, I’ve left it really brief, but I’m willing to make it more thorough if I get the sense that people think it’d be valuable.


This post is not meant as a disparagement of any of my fellow African or Asian or Latin-American EAs. This is less about you, and more about how much the world sucks, and how hard the state of the world makes it for us to fully participate in, and contribute to, EA the way we’d like to. I think I’m hoping to read a bunch of comments proving me wrong or at least making me reconsider how I feel about this. That being said, I don’t like letting feelings get in the way of truth seeking and doing what’s right. So here it goes.

Summary:

I think community builders and those funding/​steering community building efforts should be more explicit and open about what their theory of change for global community building is (especially in light of the reduced amount of funding available), as there could be significant tradeoffs in impact between different strategies.

Introduction

I think there are two broad conceptualisations of what EA is and how it functions in the world, and each has a corresponding community building strategy. From my experience, all community building initiatives fall into one of these two strategies/worldviews, each with a different theory of change. If you think there are more than these two, or that these are wrong or could be improved, please let me know. They are:

Global EA

EA can be for anybody in the world. The goal of EA community building is to spread the ideas of EA as far and wide as possible. By showing people that, regardless of their context, they can make a difference which is possibly hundreds of times greater than they would have made otherwise, we’ll be increasing the chances of motivated and talented people getting involved in high-impact work, and generally increasing the counterfactual positive impact of humanity on the wellbeing of living and future beings. I have a sense that following this strategy currently gives the movement more transparent, less secretive, and less insidious optics.

Efforts which fall into this bucket would be things like:

  • funding city and national groups in countries outside major power centres like the US, UK, EU, or China

  • funding university groups which aren’t in the top 100-200 in the world for subjects which have a track record of being well-represented amongst global decision-makers

  • allocating community resources to blindly increasing racial or geographic diversity and inclusion in the community (rather than diversity of specific viewpoints, underrepresented moral beliefs, etc.)

Narrow EA [2]

Power and influence follow a heavy-tailed distribution, and we need power and influence to make important changes. If there is a small group of people who are extremely influential or high-potential, then the goal of community building should be to seek out those people and try to convince them to use their resources to have an outsized positive influence on the wellbeing of current and future beings. I have a sense that the way this strategy is pursued often creates an optic of secrecy and icky elite control, but it doesn’t necessarily have to be this way.

Different people in the EA community hold each of these views to differing degrees; you can think either or both are right, etc.

e.g.

  • Kameel thinks it is true that everyone can be an EA (and that’s great!), but the best thing for an EA community builder to do with their limited time and mental resources is to find the most privileged/​high-leverage/​high-potential/​influential people and get them to use their outsized influence on improving the lives of people/​animals/​future generations.

  • Meelak thinks it’s true that it’d be very effective to find the most influential and high-potential people and get them to make better decisions in order to help others, but he thinks the best thing for EA community builders to do is to spread the message that anyone can maximize their positive impact on the world regardless of their circumstances, and that we likely won’t ever find all the most impactful or talented people to work on the most pressing issues if we don’t cast our net far and wide, including into communities which might otherwise find it very hard to break into the typical EA space.

Reasons I think this is important and should be addressed:

  • I used to believe strongly in EA, and in the role of EA community building, as being that of “Global EA”, but I’ve become more convinced than ever that “Narrow EA” is probably the thing we should pursue. There is a finite amount of funding available for EA community building, and it is necessary to figure out whether one of these approaches has a higher impact in expectation than the other, and prioritize that approach.

    • I expect that this is a decision that has already been made, but it isn’t written down or stated explicitly anywhere, so we’re left to wonder and suspect that this is the case.

  • As more attention and resources are allocated to longtermist causes, especially AI and biosecurity, it becomes less likely, given historical disparities in development and opportunity, that aspiring EAs from around the world can contribute cost-effectively to the EA community’s work, either directly or in a field-building capacity.

  • Supporting a global EA community is expensive. For example, flying people to conferences in the US and UK from places like South Africa and India often costs ~4x as much as a local attendee’s travel, and we have to sponsor travel and work visas. It’s not clear that EAs from faraway or less well-represented places are any more skilled or high-leverage than people living near existing EA hubs, and this is happening whilst we’re nowhere near exhausting the pool of potential EA ‘recruits’ in places like the US and UK.

    • This is mostly me thinking about *myself*: I’ve started feeling super guilty and sad about how much I, and the EA community, have wasted on supporting my participation in various community building and research endeavours. I’m not really any more capable or competent at doing the things I’ve done than a local American graduate would have been, and there was no real justification for me to spend so much of my own money moving to the US and staying here to work on these things when someone from Boston could have worked on them instead.

  • The lack of clarity and transparency from organizations like OpenPhil and CEA with regards to how they think about these models, or which of them they are pursuing, leads to a lot of emotional strife and wasted time (experienced both by community builders and by people aspiring to be EAs).

Reasons not to address this:

  • CEA/OP/“EA” doesn’t want to be seen as outright endorsing the idea that the majority of people, especially those outside of highly privileged circles in a handful of countries and cities, don’t have a part to play in the most important decisions about global wellbeing and the trajectory of human history. I think this point could basically be seen as “we don’t want to make the appearance of power-seeking worse”, but I think it speaks to deeper worries about being perceived as racist, classist, or intolerably dismissive of people who ‘aren’t important enough’.

Reasons I might be wrong:

  • I might be significantly underestimating the value of diversity and inclusion (in the most common use of the phrase).

  • There is little empirical evidence for what I’m claiming (as far as I’m aware)

  • I’m assuming we are significantly resource constrained and will remain so for some time.

  • This might be based on an entirely unrealistic false dichotomy.

Other notes:

On diversity: I think discussions about diversity in EA often seem to assume that we are failing at community building, because pursuing Global EA would result in a broad range of people being in EA. However, it’s obviously possible to believe that Narrow EA is the right approach and that diversity is really important to doing Narrow EA well. To demonstrate what I’m pointing at here:

A: “We should focus more on diversity and inclusion in EA”.

B: “That doesn’t make sense. We’re working on problems which could cause an extinction within our lifetimes; we can’t expend resources on something which is largely just a nod to political correctness or a lost sense of global justice”.

A: “I think we’re losing out on some of the most talented people in the world who could be working on these issues”.

Etc.

About me:

I grew up in South Africa, and moved to the US for university in 2016. I have lived in Boston for ~7 years, and have worked on community building almost entirely in the context of US universities and local US groups, except for helping some non-US university groups as a UGAP mentor. I have thought a lot about community building in general, including in the context of Muslims for EA.

Thanks:

  • To those who encouraged me to write this, and those who reviewed it. [3]
    Important Disclaimer: Again, these are my own views as a member of the EA community, and not the views of my employer, EV Ops, or of the Effective Ventures Foundation USA or UK. I have previously worked as a contractor for CEA on the groups and events teams.

  1. ^

    I’ve experienced an unacceptable amount of sadness when I’ve had to explain that “EA strategy fortnight” is a collective feedback contribution drive, and not the first-person shooter / collaborative prioritization game crossover episode between Effective Altruism and Electronic Arts that we’ve always wanted.

  2. ^

    I also think we make a big mistake by not framing this publicly as some type of global justice/distributive justice project. I think we’d avoid lots of power-seeking/privileged-elite critiques if the public thought of the EA community as people trying to do the best they can for everyone else with the privilege and wealth they’re randomly fortunate to have.

  3. ^

    This is a joke—nobody reviewed this