Heuristics from Running Harvard and Oxford EA Groups

This post is co-authored by Aleš Flídr and James Aung. We thank Harri Besceli for helpful comments. Our friend Tobias also wrote an excellent post with a lot of overlap that we highly recommend checking out.

In our experience leading Harvard EA and Oxford EA, we've made a lot of mistakes and therefore have a fair number of tips we would give to our past selves.

Note that the heuristics described below were built up in the context of English-speaking universities. Some points will not generalize to regional/national groups or other cultures. We expect readers to be able to judge that for themselves.

Neither of us has any major disagreements with the current CEA strategy or CEA's models of community building. We think that the heuristics below can serve as a good complement to these high-level strategic thoughts.

Some underlying beliefs:

  • The majority of value will come from a few individuals. As with most other groups, a typical student group will draw a disproportionate amount of value from a relatively small number of deeply engaged members. Most of the counterfactual value will come from the deeply immersed and engaged people in your community.
  • Long-termism. Most of the value of our actions will be determined by their impact on the long-run trajectory of humanity (and non-human sentient beings). In the context of community building, this implies a relatively stronger focus on social and epistemic norms and people who can preserve/improve them.

  • The long-term impact of ideas. The long-term impact of EA will be largely determined by the quality of our ideas. We should therefore focus on high-fidelity methods of communication.

What follows is a set of heuristics that we have built up over years. We think that these align relatively well with the underlying assumptions, CEA's current big-picture strategy and our on-the-ground experience.

Most of these heuristics come from the accumulation of a lot of anecdotal evidence, rather than systematic data-driven analysis. With that caveat in mind, here are the heuristics:

  1. Focus on your understanding of EA. There is no substitute for having detailed models and broad knowledge of everything relevant to EA. People will base their understanding of effective altruism on you, so make sure that you are as well versed in the literature as possible and that you can "cite your sources".

  2. Default to 1:1s. In hindsight, it is somewhat surprising that 1:1 conversations are not the default student group activity. They have a number of benefits: you get to know people on a personal level, you can present information in a nuanced way, you can tailor recommended resources to individual interests, etc. Proactively reach out to members of your community and offer to grab a coffee with them or go for a walk. 1:1s also give you a good yardstick for evaluating how valuable longer projects have to be to be worth executing: e.g. a 7-hour project would have to be at least as valuable as seven 1:1s, other things equal. Caveat: we definitely don't mean to imply that you should cut all group or larger-scale activities. We will share some ideas for such activities in a follow-up post.

  3. Avoid naive EA outreach.

    1. Outreach is an offer, not persuasion. It can be tempting to try to persuade as many people as possible and to run events that tweak the message of EA in an attempt to appeal to certain people. From our experience, this is generally a dangerous approach, as it leads to low-fidelity, diluted or garbled messages. Instead, think of outreach efforts as an 'offer' of EA, where people can get a taste of what it's about and take it or leave it. It's OK if someone's not interested. A useful heuristic James used for testing whether to run an outreach event is to ask "to what extent would the audience member now know whether effective altruism is an idea they would be interested in?". It turned out that many speaker events that Oxford was running didn't fit this test, and neither did the fundraising campaign.

    2. Don't "introduce EA". It's fine if people don't come across EA ideas in a particular sequence. First, find entry points that capture a person's interest. If someone finds EA interesting and likes the community, they will absorb the basics pretty soon.

  4. Don't teach, signpost. Avoid the temptation to teach EA to people. There's a lot of great online content, and you won't be able to explain the same ideas as well or in as much nuance as longform written content, well-prepared talks or podcast episodes. Instead of viewing yourself as a teacher of EA, think of yourself as a signpost. Be able to point people to interesting and relevant material on all areas of EA, and remove friction for people learning more by proactively recommending them content. For example, after a 1:1 meeting, message over three links that are relevant to their current bottleneck/area of interest.

  5. Engagement is more important than wide reach. Engagement with the community and its resources has typically been a much better predictor of the value an individual brings to the community than the impressiveness of their CV.

    1. Focus on careers, rather than direct impact or fundraising. Getting people in your student community to make progress towards a high-value career path seems like the highest-value thing you can be outputting. Students don't have a lot of money or skills, so you won't be able to do much good with direct work or fundraising in a student group. 80k's survey revealed that a median junior hire would be worth $250k to a typical EA org.

    2. Plan changes dominate. The 'results' your local group can deliver vary widely in expected impact: a GWWC pledge is much more impactful than a donation to a fundraiser, and a career plan change towards a priority path is significantly more impactful than a GWWC pledge (by 80,000 Hours' IASPC metric). Given this, if people already in your group aren't making career plans, it's more important to work out how you can encourage them to do so than to try to get more people into your group.

    3. Optimize content for the most engaged members. A good heuristic for finding useful things to do is to just ask the most engaged members of the community what they would find most valuable. Send Facebook messages to 10 people asking "what things would you find valuable for us to run for you?"

    4. Try to make the community fun and attractive. Having a fun social atmosphere in your community encourages people to keep on exploring EA and motivates people to take action. Be the one to suggest social activities and introduce people to each other.

    5. Beware excessive formalism. Formal team structures tend to just replicate what's been done the previous year. A better model for a team is a tight-knit group of 'conspirators'. Also beware of getting bogged down in meaningless admin as a substitute for learning more about EA.

    6. Develop a toolkit of questions. You want to help people get as engaged as they want to be and help them skill up as much as you can, but we often do this by lecturing at people and pushing ideas. A more fruitful strategy is to ask the right questions that encourage people to explore and engage further. For more information on how to get people to reach novel insights or change their mind, see David Rock's excellent books Your Brain at Work and Quiet Leadership. (Yes, we know that this article goes against this advice; the approach is harder in writing.) Also consider attending a CFAR workshop (Hamming questions are particularly useful).

In a future post, we will share a couple of projects compatible with these heuristics that worked particularly well for our groups.