Better models for EA development: a network of communities, not a global community

[EDIT: I recommend first reading this comment, which clarifies the raison d’être of this post. The post should make more sense afterwards.]

Intro

Whenever this post mentions ‘we’, it refers to EA Geneva.

This article aims to contribute to developing a better understanding of ‘community building’. It is the result of many discussions indicating that some considerations regarding terminology, professionalization, and group dynamics have not yet been sufficiently discussed or documented. That seems due in part to (i) a lack of people with time to think about EA network strategy and (ii) the fact that the people who do think about it exchange ideas mostly informally.

We hope to contribute to the model-building process by presenting some of our current thoughts, not as conclusions but to nourish discussion.

Summary

  • Communities are groups of humans with a shared identity who care for each other.

  • The EA network is a set of organizations, communities, and individuals who share a mission but cannot all care for one another.

  • Building real communities is done through psychological safety and common growth; growth is facilitated by shared work.

  • Developing the network is done by aligning incentives and coordinating professionally.

  • We make a first attempt at disentangling responsibilities and goals to improve network development and clarify what community building entails.

Understanding and defining ‘community’

In his 2018 EA Global opening speech, William MacAskill compares the Spartans to the Athenians. Spartans get stuff done through centralised enforcement of conformity. Athenians hope for the best argument to win through the open-ended celebration of debate. It is a neat set-up for then defining EA through its methods, values, norms, and culture.

In his talk, MacAskill refers to those who identify with the definition of EA as the EA ‘community’. But this broad understanding of ‘community’ causes confusion, because the groups currently labelled ‘communities’ do not all look alike. By using ‘community’ as a generic term, an essential part of a powerful community is diluted:

A community is a group of humans with a shared identity who care for each other.

EAs aspire to care for everyone, of course, but in practice no single individual can take care of everyone else. Our limited time and energy pose hard constraints on even the most caring humans. The above definition brings about the realisation that a global community is simply not something current human beings can achieve.

Clear distinctions can help to better define activity portfolios and responsibilities, and to set realistic expectations. We suggest the following definitions:

  • An aspiring effective altruist (EA): an individual who identifies with EA principles.

  • An EA community: a group of aspiring EAs who take care of each other.

  • An EA organization: a group of aspiring EAs collectively contributing to EA’s mission (anything from local groups to EA Czechia to specialised orgs like the Open Philanthropy Project).

  • The EA network: the global set of individuals and organizations who identify with EA principles.

One could illustrate the definitions hierarchically as follows:

But it is probably more correct to assume that (i) not all aspiring EAs have found a community within the EA network; (ii) not all EA organizations exclusively need aspiring EAs as employees; (iii) EA communities can contain small numbers of not-entirely-aligned individuals; and that (iv) some organizations function without a community at their core:

Whether we call people ‘aspiring EAs’ or just ‘EAs’, and whether we say ‘EA network’, ‘system’, or ‘nexus’: the point is that the EA network needs a clear understanding of what it means by the term ‘community’, and of what others understand by it.

The rest of this article will further illustrate the case for defining ‘EA community’ along the lines of ‘a group of aspiring EAs who take care of each other’ and for labelling the other entities in a clearly distinct manner.

Breaking down current ‘community building’ efforts

A vast array of activities has so far been labelled ‘community building’, to many community builders’ confusion. The term seems to have originated from this line of thought:

  1. It is good to have people who:

    1. Thoroughly understand EA-related topics;

    2. Act on that knowledge to do the most good; and

    3. Stay on top of the collective thought development.

  2. Bringing about more of (1) is good.

A decent definition of the EA network one would want to ‘build’ thus seems to be ‘the set of people and organizations who seek to maximally satisfy all three criteria of (1)’. And anything that remotely resembles (2) has since been labelled as ‘building’:

  • CEA (incl. 80,000 Hours) ‘builds’ globally by trying to be the go-to organization for anything EA; creating online content; compiling resources & knowledge; coaching and placing individuals; coordinating with key organizations; encouraging donations; supporting local groups & projects; and hosting Schelling events for the actively engaged segment of the global network. Some of these services are also offered by LEAN.

  • Other, more geographically focused organizations’ activities often appear to be similarly broad: from small social events; to freshers’ fair booths; to career planning and EA concepts workshops; to personal coaching; to representing EA at important events; to organising large-scale conferences; to translating key resources into the local language; all the way to (yay, meta!) supporting other local groups.

Sorting these activities into more self-explanatory categories gives a better understanding of what ‘community building’ entails. There seem to be roughly five different parts to ‘building’:

  1. Marketing

  2. Coordination

  3. Recruitment

  4. Training

  5. Care

All five types of activity can take place, in varying forms, at the global, organizational, and community levels. Disentangling what is and what should be happening, and how, why, when, and where, is another step in optimising the coordination of the EA network.

Professional ‘network development’

In CEA’s current ‘community building’ grants round, the sole quantified metric used to assess recipients is how many people they direct towards ‘priority roles’. This is an understandable metric, as it should prove value alignment and the short-term effectiveness of a group.

Despite explicitly leaving room for various other achievements by grantees, it skews CEA’s implicit definition of ‘community building’ towards one single activity: recruitment. That is not a bad thing at all; EA Geneva itself agreed to CEA’s terms because we think they make sense. It just does not have much to do with community.

And it is one of several pointers in a general direction: organizations without paid staff are generally unlikely to do an impressive job at at least four, if not all five, of the activities we defined: marketing, coordination, recruitment, and training.

A look at the private sector suggests that most of the short-term value that local groups and the like are currently expected to produce is best reaped by specialised professionals.

Alternatives to letting organizations grapple with these activities would be to centrally employ or organize:

  • Professional representatives (lobbyist-like) to build influence in relevant networks

  • Professional EA coaches to support aspiring EAs in their work or studies

  • EA training and seminars to skill up individuals and institutions to do more good

  • Recruitment tours through universities

  • Speaker tours through conferences, foundations, and companies

  • Targeted outreach to outside experts in related fields for input and feedback

These roles are attractive also because staff could be trained and coordinated more effectively, providing a much higher marginal return than leaving these activities to each local community or organization.

The value from organizations and communities will then come through activities that require a (semi-)permanent local presence. Being locally anchored in relevant regions gives access to more networks and will likely improve the EA network’s impact potential.

Community building

The network can do a great job at conveying the drive of its members. But the roots of the excitement most people experience at their first EA Global conference are developed by tight-knit communities that work on EA organizations. What makes such communities?

Limits to community size

A well-known objection to EA claims that many people care more for their close circle than for the rest of the world. It seems to conflate two types of care. One type regards moral patienthood and is often meant to imply some form of responsibility. The other is the practical question of “who do I invest my relationship time in?”. Aspiring EAs sometimes seem to make the same error.

Caring for all beings in the abstract does not change the fact that humans are limited by their time and their number of neocortical neurons when it comes to those they can take care of personally. In turn, the number of people who can take care of them is similarly limited.

Luckily, it’s not necessary that everyone personally takes care of everyone else’s well-being. A sense of moral concern is enough for EA communities of ~30 people (cf. the size of bands) to coordinate effectively as a network.

Psychological safety as a key

To provide a true sense of community, members need to know that they are safe. This visceral trust is maintained through strong relationships, structure, and participation. You want people to really know that you care for them? Well, everyone has to put in quite a lot of work.

An alleged lack of psychological safety within the EA network appears to put off even theoretically hardcore EAs from taking a job at certain organizations. We postulate that the underlying problem is not, as often put forth, some form of discrimination or a lack of diversity. Instead, what seems to be missing is the visceral sense of care and belonging that no network can provide.

Calling a network a ‘community’ creates a false sense of belonging that can only be disappointed in the long run.

Nowadays, most Westerners can freely choose their primary community. It does not even have to include their biological relatives. That freedom does not, however, make things easier or allow for bigger communities. On the contrary, the absence of centuries-long traditions only makes community building more difficult, even on a small scale. Humans are still just monkeys.

A human’s primary community is not going to be much bigger than a band. The band has to actively choose whom to include, and inclusion means commitment, which will be limited by the amount of time and energy community members can invest.

Organization building

EA hubs, like Berkeley and Oxford, seem built around professional organizations that have come out of very small communities (e.g. close friends, couples, colleagues).

An explanation for small communities nourishing impactful organizations could be that jointly working on your relationship(s) and/or projects brings people closer together. Professionalization bonds people even more intensely. The output generated by work attracts other similarly dedicated individuals, who are often searching for just such a community.

A focused and intense-yet-caring community can inspire others to branch off and develop their own community, or help them find their spot in an existing one. These dynamics maintain a virtuous, or at least self-sustaining, cycle for hubs that have surpassed a certain critical mass.

Organizations without a paid community at their core seem much less impactful and sustainable than organizations built around a mission and a core community. Of course, it is a chicken-and-egg problem, but understanding it seems important for spotting whenever there is either a community that should be working on something or something to work on that could shape a community.

Groups of friends might want to try to figure out how they can work together on a promising project. Volunteer organizations should try to work out whether they can build a community by identifying what they could work on professionally. It’s the ultimate stress test, and with the right systems and output, it provides immense learning value and proof to the network.

Network development

The EA network has one mission: do the most good. EA organizations, communities, and individuals aim to maximally contribute to that mission, but they also have to make many trade-offs to take care of things that are only indirectly related to the core mission (e.g. maintaining their health, education, epistemic standards, and local integration). The network has to leave room for adaptation to preserve its own resilience, but it has to unmistakably focus on pushing its core mission by incentivising maximal contribution.

To do better than just one community or organisation, a network needs to coordinate well. To coordinate well, it needs the means to build alignment and to professionalize. To build alignment, communication and training systems need to be installed. To develop the necessary systems, people need to understand the network and its components.

To develop the network, EA needs people whose job it is to build the systems necessary to effectively align communities and organizations. Essentially, these people would get paid to have the relationship time necessary to be part of two or more communities.

‘Network developers’ then have responsibilities distinct from those of ‘community builders’. Community builders can focus on maximising their community’s utility (and will do a better job if employed by the community), while network developers maintain alignment when employed by a ‘network development organisation’. Both roles are then incentivised to cooperate optimally, as the network’s impact and each community’s relevance depend on each other.

Conclusion

What we propose could look something like this: CEA becomes the ‘network developer’, maintaining the network’s quality and incentivising maximum contribution. LEAN focuses on supporting actual communities in developing their members, teams, and organizations. Organisations like the EA Stiftung, EA Czechia, or EA London support CEA locally by complementing its efforts with services that are difficult to tailor to local realities from afar. Local groups develop the capacity to optimally feed into the network, depending on their biggest possible value add. Communities take care of their members on a human level and build the badass, specialised core teams that EA needs for maximum-impact work.

Of course, this is fully conditional on finding the right people, communities, and organisations that should be part of the network. That’s a discussion for another time.