EA Survey 2019 Series: Careers and Skills

Summary

  • The most popular career paths that effective altruists in the survey (EAs) plan to follow are earning to give (38%) and working at EA organizations (37%).

  • 50% of EAs have only one planned broad career path.

  • Two of the top four significant barriers to becoming more involved in EA were “not enough job opportunities that seemed like a good fit for me” (29%) and “too hard to get an EA job” (23%).

  • 462 (38%) EAs have at least 3 years’ work or graduate experience in the most popular skills highlighted as important talent needs for EA in a recent 80,000 Hours/CEA survey of EA leaders.

  • 607 (30%) EAs have applied to work at an EA organization or plan on working at an EA organization, but don’t yet have experience in one.

  • 296 (15%) EAs in the survey already work(ed) at an EA organization.

  • 1,014 (58%) EAs want to become more involved in the community by pursuing a career in an EA-aligned cause area.

  • AI Risk, Other X-Risk, and Climate Change are the most common cause areas EAs are interested in pursuing a career in.

  • AI Risk and Meta work are the cause areas EAs are most often currently researching/studying/working on.

  • Those who work(ed) at an EA organization are demographically similar to other EAs.

This is a supplementary post to Rethink Charity’s main series on the EA Survey 2019, which provides an annual snapshot of the EA community. In this report, we explore what career paths EAs are planning to follow and their academic and skills background, and specifically highlight issues around working at an EA organization. In the first post we explored the characteristics and tendencies of EAs captured in the survey. In the second post we explored cause preferences of EAs. In future posts, we will explore how these EAs first heard of and got involved in effective altruism, their donation patterns, and their geographic distribution, among other topics.

Introduction

Using your career to achieve impact is clearly a major interest of EA, with an entire organization, 80,000 Hours, dedicated to this pursuit. In this year’s EA Survey we asked a number of questions about EAs’ career paths and skills. We think this will be of special interest to the EA community given the discussion of pressures to work at EA organizations, how hard it is to get these types of jobs, and the impetus to put more focus on “non-standard” EA careers (much more discussion here, here, and here). We can provide some data on whether bottlenecks lie in the skills available in the talent pool, the number of opportunities available, or other areas such as cause prioritization misalignment. This may help decide where EA needs to develop the talent that already exists in the movement, where EA needs to grow new talent, and how much of the bottleneck lies in mismatched skills versus missing skills.

Career plans

When asked about the top ways EAs are interested in becoming more involved in the community, the most common response was Pursue a career in an EA-aligned cause area (58%, 1,014 EAs)[1], and a majority changed their career plans based on EA principles (51%, 1,025). However, when asked about significant barriers to greater involvement in EA, two of the top four reasons among the 1,756 who answered this question were that there are not enough job opportunities that seemed like a good fit for me (29%, 514) and it is too hard to get an EA job (23%, 410).

We asked Which broad career path(s) are you planning to follow? and presented a list of options from which respondents could choose as many as they liked. 1,877 EAs responded with at least one of the career path options.[2] However, 50% (943) of these reported only one career path. That so many EAs don’t appear to have alternative career plans may be one contributing factor in the sense of disillusionment surrounding EA-aligned job applications and rejections.

A plurality of responses were for a planned career path in For-profit (earning to give) (37.8%, 709). This is interesting given the discussion in recent years about whether earning to give should be the default strategy for most EAs or whether only a small proportion of people should earn to give long term (also see discussions here and here). A similar share (36.6%, 688) of responses were for Work at a non-profit (EA organization), which is generally seen as a promising career path for impact. There was overlap between these two groups, with 219 EAs selecting both. 21% (147) of those planning to work at an EA organization and 40% (283) of those planning an earning to give role did not report any other planned career path. The most common second career path of those aiming to work at an EA organization was Think tanks / lobbying / advocacy (242), while it was Work at a non-profit (EA organization) (219) for those planning on earning to give.

EA organization career status

Although it is just one of many promising career paths, the goal of working at an EA organization has attracted a lot of comment. In the data below we can see that significant numbers of EAs are considering this option, far more than the number of opportunities there appear to be. We draw on data from two questions in the survey that allowed respondents to select multiple activities they have engaged in or career paths they plan to follow.[3] There are 207 EAs in the survey who report currently working at an EA organization, and far more who have applied to work at, or plan to work at, an EA organization. However, these raw descriptive statistics are not very informative without accounting for overlap between these categories.

We grouped responses for Worked at an EA organization and Currently work at an EA organization together into a single category of “work(ed) at an EA organization”. We can clearly see in the Venn diagram below that the number of people who have applied to work at, or expect to follow a career path working at, an EA organization is huge compared to the apparent current number of EA organization jobs (indicated by the current numbers working in EA). This may help explain why EA jobs are so competitive. 280 (32%) of the 887 EAs that have applied for an EA job or plan to follow this career path also work(ed) at an EA organization.[4] 607 (30%) reported they have applied for a job at an EA organization and/or are planning to follow a career path working at an EA organization, but neither currently work nor previously worked at an EA organization. There are 54 EAs who work(ed) at an EA organization but didn’t report having applied for such a job. This might reflect that some people who work at EA organizations are co-founders or are headhunted without an application.

We can imagine numerous ways of segmenting our data. One straightforward approach is to divide EAs between those with experience in EA organizations (“work(ed) at an EA organization”), those working towards getting such experience (“candidates”), and those not interested or involved in EA organization work (“not interested”).

  • Candidates: 607 who reported they have applied to work at, or plan to work at, an EA organization but haven’t yet worked at one.

  • Work(ed) at an EA organization: 296 EAs in the survey have worked or currently work at an EA organization.

  • Not interested: 1,610 who have not applied to work at, do not plan to work at, and have never worked at an EA organization.

Talent & Skills

A recent CEA and 80,000 Hours survey of EA leaders[5] asked about the talents EA as a whole or their organization would need more of over the next 5 years. The most common responses in that survey for EA as a whole were government and policy (12.9% of all responses) and management (12.3%), while generalist researchers (9.4%), operations (8.9%), and management (7.8%) were the main needs for respondents’ own organizations.[6] “Machine learning / AI technical expertise” attracted ~7% for each. The percentages and the figure below use data from that survey and highlight only the most popular responses that have a comparable category in the EA Survey.

In the 2019 EA Survey, we asked respondents about areas in which they had at least 3 years of work experience or graduate study. Respondents could select multiple options. The top three responses among the 1,206 who answered were software engineering (27.2% of all responses), management (17.7%), and maths and statistics (16.7%), with the first being clearly the most common response. Only one of these (management) appeared among the top talent needs in the EA leaders forum survey.

There were 462 EAs (38%) who selected at least one of the most common areas cited in the leaders forum survey. 17.7% (214) reported having experience in management, 10.9% (131) in operations, 9.8% (118) in government and policy, 6.3% (76) as generalist researchers, 8.5% (103) in Machine Learning, and 1% (13) in AI technical safety. One cannot make a direct comparison between the two surveys because the leaders forum survey reflects the percent of leaders who think skill/experience X is important, not what percent of EAs need to have this skill/experience. To the extent that the share of EAs with experience in X reflects the importance they place on these skills, one can compare the ranking of skills held by EAs to the relative ranking of skills EA leaders say EA as a whole needs. On that comparison, EAs in the survey appear to underemphasize experience in government/policy, generalist research, and AI technical work, and perhaps overemphasize Machine Learning.


Certain subgroups in the EA survey are more likely than the sample as a whole to have these skills. Due to the large number of categories, many with zero responses among a subgroup, significance testing the differences would not offer very reliable estimates here. Instead, we highlight some descriptive differences.

42% of those planning to follow an earning to give career path had a background in software engineering (versus 27% of the sample as a whole) and 20% selected web development (versus 13%). This was a “select all” question, so there was overlap between these two categories. This makes sense given that these are typically high-earning jobs. These respondents were most interested in becoming more involved in EA by giving more (69%) and pursuing a career in an EA-aligned cause area (57%).

As one would expect given the historical focus of the site, members of LessWrong have more experience in Machine Learning (14% versus 9%) and technical AI safety (4% versus 1%) than the typical EA. EAs with self-reported high levels of engagement in EA most often reported experience in management (25% versus 18%) and operations (22% versus 11%), at higher rates than the survey as a whole. That more highly engaged EAs tend to have the skills most valued by EA leaders is not surprising given that a suggested criterion for choosing high engagement was working at an EA organization.

There is a clear gender divide in terms of skills and experience. The most common response among women was management, while the most common response among men was software engineering. The top three skills/experiences that female EAs reported were management (19% versus 18% for men), government/policy (15% versus 8% for men), and/or operations (14% versus 10% for men), while male EAs most often reported software engineering (33% versus 10% for women), maths (19% versus 11% for women), and/or management (percentages as above).

EA organization employees

As another proxy for the demand for and supply of talent, we can explore the experience and skills of those who work(ed) at EA organizations compared to other EAs. Those who work(ed) in EA organizations are most likely to have 3 years of work/graduate experience in management (31% versus 18% for the sample as a whole) and operations (24% versus 11%). For “candidates” the most common responses were management (21% versus 18%) and software engineering (21% versus 27%). Those not interested in EA organization careers most often had experience in software engineering (35%) and/or maths and statistics (19% versus 17%). The figure below highlights only the most common responses and those highlighted in the EA leaders forum survey, and includes the group of EAs not pursuing or involved in jobs at EA organizations.

Those who work(ed) at an EA organization appear to have experience in AI technical safety and generalist research more often than “candidates”, who in turn appear to more often have government and policy experience than those who work(ed) at an EA organization. Similar shares of each group appear to have experience in software engineering and maths and statistics. If one were to assume the share of those who work(ed) at an EA organization reflects the need for a certain skill, this might suggest that those currently in the EA organization “job pool” are underskilled in the areas of management, operations, movement building, and generalist research, while there appears to be an oversupply of candidates with government and policy experience and Machine Learning.

Geographic distribution of skills

We will explore the geographic distribution of EAs in detail in a forthcoming post, but here we can look at the distribution of skills/experience among the countries most EAs live in (USA, UK, Germany, Australia, Canada, Rest of the World). Given that a plurality of EAs live in the USA, it is not surprising that in raw numbers the USA tops the list in each skill category here, although the gaps might not be as large as some expect. It is also interesting to explore what percent of EAs in a country have each skill/experience. The tables below highlight in dark green the country with the largest percent/number of its EAs with a given experience, and in dark red the lowest. For example, 7% of EAs living in Canada had generalist researcher experience, higher than the share in the other countries listed. However, in absolute numbers there were more EAs with generalist researcher experience in the USA than in Canada (28 versus 6).

A larger share of EAs living in Canada (7.1%), the UK (5.8%), and Australia (4.6%) have generalist researcher experience than those in the USA (3.7%). EAs based in Canada appear disproportionately more experienced in Machine Learning (8.3%) and operations (10.7%) than EAs based in other countries. AI technical experience is markedly absent or low among most EAs, regardless of country of residence. Only 1.2% of USA- and Canada-based EAs have such experience. Australia-based EAs appear to be more experienced in government and policy (14.5%) than EAs based elsewhere. ~9 to 13% of EAs in the top countries and the “Rest of the World” have experience in management.

Time in EA

Given the growth in the EA movement, we might expect that veteran EAs would be experienced in the skills emphasized in previous years, while newer EAs would be experienced in the skills being emphasized today, such as operations and management. This of course assumes that veteran EAs have not been gaining new experience in line with the changing needs of the movement, which may not be true. It is also worth bearing in mind that those who joined EA in 2019 were younger on average than the sample as a whole but joined EA at an older age than previous cohorts. It is clearly more difficult for younger people to have as much experience as older people. To the extent that newer EAs also tend to be younger, we might expect newer EAs to have less experience in those areas which are higher up the ladder of experience (it is likely easier to get generalist researcher experience when you are younger than a position in government or policy). The figures below chart the number of EAs and their experience by the year they joined EA, and the share of each year’s EAs with each experience, excluding anyone under the age of 23.

The share of EAs with experience in government and policy is relatively steady across the cohorts (with the exception of those who joined in 2011, topping the list at 11%), but the largest number of EAs with such experience are those who joined EA 2-3 years ago. This could reflect that the newest EAs have yet to build up such experience and that individuals who already have government/policy experience have not been joining the movement in recent years. Similarly, while the share of EAs with operations experience is higher among veteran EAs, the largest number of EAs with such experience are those who joined EA 2-4 years ago. A greater share of veteran EAs have experience in Machine Learning than newer EAs, although in absolute numbers newer EAs make up the bulk of those with Machine Learning experience. There doesn’t appear to be a trend of newer EAs being more experienced in AI technical safety than veteran EAs, with very few EAs having this experience at all. The number of EAs with management experience decreases with time in EA while the share increases, although there appears to be a plateau and drop-off in absolute numbers among EAs who joined after 2016. A greater share of veteran EAs have experience as generalist researchers than new EAs, but the trend in the absolute number has been irregular and has also started dropping off among EAs who joined after 2016.

Do EA organization employees prioritize different cause areas to EAs in general?

As mentioned above, the most common top way EAs in the survey are interested in becoming more involved in the community was Pursue a career in an EA-aligned cause area. Therefore, it seems worthwhile to examine how the cause prioritization of EAs differs by career status and path.

The table below shows the mean rating for each cause,[7] with the causes receiving the highest rating per group highlighted in dark green, and the cause a group gave the lowest rating highlighted in dark red. Both those who work(ed) at an EA organization and “candidates” tend to prioritize AI Risk and Cause Prioritization the most. Candidates appear to prioritize Climate Change and Global Poverty more than those who work(ed) at an EA organization.[8] More striking is the difference in average cause prioritization between those who work(ed) at an EA organization and the rest of the sample. Those who work(ed) at an EA organization tend to place a higher priority on X-Risk, Meta, AI Risk, Cause Prioritization, Animal Welfare, and Biosecurity than other EAs, who in turn place a higher priority on Global Poverty, Climate Change, and Mental Health.[9] It is unclear whether EA organization career paths are more attractive to those interested in these causes, or whether EAs interested in these career paths alter their cause preferences to align with a perceived cause preference. It could also be that there is a correlation between engagement in EA, working in an EA organization, and cause prioritization. As one comparison, AI Risk (both short and long timelines) and Global Health were among the cause areas EA leaders thought the highest percentages of resources should be devoted to over the next five years.
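The label-to-number conversion behind these means (footnote 7) can be sketched in a few lines of pandas. Only the endpoint labels come from the survey; the middle labels and the ratings below are invented for illustration:

```python
import pandas as pd

# Endpoint labels (1 and 5) are from the survey's footnote; the middle
# labels and all ratings below are invented for this example.
LIKERT_SCALE = {
    "I do not think any resources should be devoted to this cause": 1,
    "Some resources should be devoted to this cause": 2,
    "Significant resources, but less than the top priorities": 3,
    "This cause should be a near-top priority": 4,
    "This cause should be the top priority": 5,
}

responses = pd.DataFrame({
    "group": ["work(ed) at EA org", "work(ed) at EA org",
              "candidate", "candidate"],
    "cause": ["AI Risk"] * 4,
    "rating": [
        "This cause should be the top priority",
        "This cause should be a near-top priority",
        "Significant resources, but less than the top priorities",
        "This cause should be a near-top priority",
    ],
})

# Map labels to 1-5 and take the mean rating per group and cause.
responses["score"] = responses["rating"].map(LIKERT_SCALE)
mean_ratings = responses.groupby(["group", "cause"])["score"].mean()
print(mean_ratings)
```

Each cell of the table above is the mean of such scores within a group, with the caveat from footnote 7 that the mean of a Likert scale has limited interpretability.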

When asked what cause area they are currently working on, Improving Rationality/Decision Making/Science is the most popular response among all EAs, followed by AI Risk, Mental Health, and Global Poverty. Among those with work experience in EA organizations, the most popular causes are Meta, AI Risk, and Animal Welfare.

When asked what cause area they are interested in pursuing a career in, Improving Rationality/Decision Making/Science is the most popular response among all EAs, followed by AI Risk and Global Poverty. Among those with work experience in EA organizations, the most popular causes are AI Risk, Other X-Risk, and Climate Change.

Diversity among EA organization employees

Overall, EAs in the survey continue to be most often male, white, well-educated, and between the ages of 25 and 34, and many EA organizations are aiming to increase the diversity and inclusion of their staff. Those who work(ed) at an EA organization appear to be on average ~2 years younger than other EAs (29 versus 31).[10] 17% of those who work(ed) at an EA organization have attended a top 20 university, compared to 20% of other EAs. Based on our data, there were no significant associations between working at an EA organization and gender or race (the split was heavily and similarly skewed towards males and those identifying as white in both groups).[11] Those who work(ed) at an EA organization are more likely to be veg*n (vegetarian or vegan) than other EAs; however, this is likely because those who are more engaged in EA are both more likely to be veg*n and more likely to work at an EA organization.
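The two tests behind footnotes 10 and 11 (Welch’s t-test on age, chi-squared tests on gender and race) can be sketched as follows. The age samples and the contingency table are simulated stand-ins, not the survey data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated ages: EA org group ~2 years younger on average (29 vs 31),
# with group sizes matching the survey's rough proportions.
ages_ea_org = rng.normal(29, 9, 296)
ages_other = rng.normal(31, 10, 1500)

# Welch's t-test: equal_var=False allows unequal group variances.
t_stat, p_age = stats.ttest_ind(ages_ea_org, ages_other, equal_var=False)

# Chi-squared test of association between EA org status and gender,
# on a made-up 2x2 table with near-identical gender splits per group.
table = np.array([[205, 91],      # work(ed) at EA org: male, non-male
                  [1010, 440]])   # other EAs: male, non-male
chi2, p_gender, dof, expected = stats.chi2_contingency(table)

print(f"age difference p = {p_age:.4f}, gender association p = {p_gender:.3f}")
```

With near-identical proportions in both rows, the chi-squared p-value is large, mirroring the null result on gender reported above.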

Predictors of EA organization careers

What are the characteristics associated with being more likely to apply for a job at an EA organization? What are the characteristics associated with being more likely to work at an EA organization, conditional upon having applied?

A complete model of who applies for jobs at EA organizations is doubtless beyond the scope of the data available in the EA Survey; however, we can look at some of the indicators that are likely to play a role. In their write-up of this career path, 80,000 Hours note that these organizations generally look for people who are already engaged with the community in some way, such as attending an EA Global, and who can demonstrate their ability to do the work well, such as by helping to run a local EA group, publishing content on the EA Forum, and/or having work or graduate experience in the relevant field. It has been noted that spending resources on applications is easier for those with a financial runway and a support network. There is also an impression that recruitment configurations may favor elite (and highly privileged) applicants. Some EA/EA-aligned organizations, like the Open Philanthropy Project, have discussed that in some cases the applicant pool was not as diverse as desired, and that needing visas (especially for the USA, where many EA organizations are based) is a difficulty.

We therefore explored a logistic regression model of applying for an EA organization job against data in the survey that corresponded to these factors as best we could, and included some further controls such as years in EA and current employment status. The model essentially asks: if we assumed there was no real relationship between these factors and the probability of applying for a job, how surprising would the data we have be? This can be used to update one’s priors about any hypothesized relationships. This model explained ~26% of variance.[12]

The figure above plots the estimated odds ratios and their 95% confidence intervals for factors that had p-values <0.05, suggesting this data would be very surprising to see in a world where there was no association between the variables. These are more straightforward to interpret than regression coefficients. For example, the odds of having applied are ~2.9 times higher for someone who has attended an EA Global than for someone who has not.

The model results suggest that in our sample an EA was more likely to have applied for an EA organization job if they had completed or were in the process of completing a PhD, had attended EA Global, had a Master’s or Bachelor’s, received 80,000 Hours coaching, posted on the EA Forum, are an EA Forum member, currently live in the USA, are veg*n, and are a woman. An EA in the survey was less likely to have applied for an EA organization job if they were a student, had a high income (keeping in mind that some EAs in the survey are extremely high earners and presumably in lucrative earning to give roles), and were older (for example, a 40 year old was ~15% less likely to have applied compared to a 20 year old). We did not find results that would be surprising given the null hypothesis above for race, years in EA, having 3 years’ experience in any field, being an EA local group leader, having organized an EA event, being a member of LessWrong, EA Facebook, an EA local group, or GWWC, being employed full time, or being retired.

It is important to keep in mind that we do not have chronological data on when someone applied. For example, while there is an association with currently living in the USA, we don’t know how many were living in the USA when they applied. We can only suggest that those who applied are more likely to currently live in the USA than elsewhere, which is the case for a plurality of EAs in the sample.

How does this model perform when we look at whether or not a respondent work(ed) at an EA organization, conditional upon them having applied?

The same model explains ~28% of the variance in work(ed) at an EA organization, conditional upon having applied; however, far fewer factors appear to play a significant role. In the figure above we again plot odds ratios, but also include factors which were not statistically surprising (those where the 95% confidence interval bars cross the dashed line) as a comparison to the model before. The model results suggest that in our sample an EA was more likely to have work(ed) at an EA organization if they had also organized an EA event and were an EA Forum member. An EA in the survey was less likely to have work(ed) at an EA organization if they were a member of a local group, presumably because they were not able to keep up membership in a local group in addition to their EA job responsibilities. For the other factors in the model there were no statistically significant differences.[13]

Credits

The annual EA Survey is a project of Rethink Charity with analysis and commentary from researchers at Rethink Priorities.

This essay was written by Neil Dullaghan with contributions from David Moss. Thanks to Derek Foster, Saulius Simcikas, and Peter Hurford for comments.

If you like our work, please consider subscribing to our newsletter. You can see all our work to date here.

Other articles in the EA Survey 2019 Series can be found here.



  1. 67 additional respondents selected the option “NA”, as distinct from simply not answering the question. These responses were excluded from this analysis. ↩︎

  2. Of course, this is not the same as working for an EA organization. ↩︎

  3. 1,991 EAs responded to the question Which of the following activities have you ever done?, from which the “applied to/worked at/currently work at” responses are drawn. 1,792 EAs responded to the question If you had to guess, which broad career path(s) are you planning to follow?, from which the “plan to work” responses are drawn. ↩︎

  4. One should not interpret this as meaning that 32% of those in the EA job pipeline are internal. We do not have chronological data to distinguish between those who are currently applying/expecting an EA organization career path and those who already applied or expect to continue on the career path they already have at an EA organization. ↩︎

  5. Note that there are some concerns expressed in the comments section of that post that this survey did not contain a representative sample of EA leaders. ↩︎

  6. Respondents could select up to 6 options. There were other skills listed that do not have a directly comparable datapoint in the EA Survey. For example, “The ability to really figure out what matters most and set the right priorities”. ↩︎

  7. Each of these options was converted into a numerical point on a five-point scale (ranging from (1) ‘I do not think any resources should be devoted to this cause’ to (5) ‘This cause should be the top priority’). We recognise that the mean of a Likert scale as a measure of central tendency has limited meaning in interpretation. Nevertheless, it’s unclear that reporting the means is a worse solution than other options the team discussed. ↩︎

  8. Simple bivariate ordered logistic regressions suggest those in EA jobs gave higher responses for X-Risk, Meta, AI Risk, Cause Prioritization, Animal Welfare, and Biosecurity, and lower responses for Global Poverty, Mental Health, and Climate Change (where p<0.05 and odds ratios range from 1.5 to 2.7). ↩︎

  9. Simple bivariate ordered logistic regressions suggest those who work(ed) at an EA organization are 1.4 to 2.7 times more likely to prioritize the former group of causes than other EAs, and other EAs are 1.2 to 3.3 times more likely to prioritize the latter group of causes than those who work(ed) at an EA organization (where p<0.05). ↩︎

  10. Welch’s t-test suggests a statistically surprising difference in means (p<0.00001), a difference of 1.9 years (Cohen’s d 0.19), assuming a null hypothesis of no difference in means and an alpha of 0.05, with 80% power to detect an effect of 0.186 or greater. ↩︎

  11. For gender, a chi-squared test of association found Pearson chi2(1) = 0.1332, Pr = 0.715, with 95% power to detect a small effect size. For race_white, a chi-squared test of association found Pearson chi2(1) = 0.5489, Pr = 0.459, with 95% power to detect a small effect size. Based on sample sizes of 1,745 and 1,760 respectively and Cohen’s suggested “small” w value of 0.1. ↩︎

  12. We do not think these models are the final word on who is more likely to apply for or have an EA organization job, but for the sake of space and simplicity not all research avenues could be explored here. There are some issues anyone seeking to build on this work should keep in mind. The effects discussed are “uncorrected” in the sense that an adjusted alpha level for multiple comparisons was not used. We used a variable for having 3 or more years’ work or graduate experience in any listed category rather than dummy variables for each experience. This was primarily for two reasons: 1) without knowing what job a respondent was applying to, we do not know which experiences are relevant, and 2) logistic regressions tend to identify those features that are heavily biased as significant, which may be the case for some categories with only a few responses. Similar concerns surround including any information about cause prioritization data and subjects studied. The log of income appeared statistically significant in the model of applying to work at an EA organization, but descriptive statistics suggested the association may be curvilinear, such that the lowest- and highest-earning EAs are less likely to apply than middle-range earners. ↩︎

  13. Given our sample size of 309 in this model, our alpha level of 0.05, and a power of 80%, only odds ratios of 0.69 or more extreme could be statistically significant, and our observed odds ratios for all other factors in the model were smaller than this, so we could not reject the null hypothesis for them. ↩︎
