Thanks for this post! I found it quite interesting and useful.
I feel like some parts of this post could give the (very likely inaccurate) impression that you/80,000 Hours thinks working at an EA organisation is distinctly better than essentially all other roles. Specifically, I think this might result from the post repeatedly talking about people who work(ed) at EA orgs, and whether they left, without as repeatedly/prominently talking about people working in other high-impact roles (e.g., in non-EA academia, politics, or AI labs).
I’m pretty sure that the real reasons why this post gives disproportionate attention to data from EA orgs are simply that:
It’s easier to get data about people who were working at EA orgs than about EAs in other high-impact roles
“High-impact roles” is a less clear-cut category, different people have different views, and “typical EA views” on that have changed in various ways over the last decade
And those seem to me to be good reasons for this post to be the way it is.
But I felt like it might be worth making those reasons explicit, to counter potential (very likely mistaken) inferences that Ben Todd/80k considers working at EA orgs to be essentially the most impactful thing to do. That’s because, historically, many people seem to have updated hard on what they perceived 80k to be saying, even when 80k didn’t mean that, including on this topic in particular. (And there are also other things that seem to bias EAs towards focusing overly much on roles at EA orgs, as discussed e.g. here.)
Hi Michael, I made some quick edits to help reduce this impression.
I also want to clarify that out of the 6 methods given, only 1 is about people working at EA organisations.
Is there even 1 exclusively about people working at EA organisations?
If someone had taken a different job with the goal of having a big social impact, and we didn’t think what they were doing was horribly misguided, I don’t think we would count them as having ‘dropped out of EA’ in any of the 6 data sets.
I was referring to things like phrasings used and how often someone working for an EA org vs not was discussed relative to other things; I wasn’t referring to the actual criteria used to classify people as having dropped out / reduced involvement or not.
Given that Ben says he’s now made some edits, it doesn’t seem worth combing through the post again in detail to find examples of the sort of thing I mean. But I just did a quick ctrl+f for “organisations”, and found this, as one example:
This is definitely not explicitly saying “not dropping out = working at an EA org”. Instead, I think it’s meant as something more like “There are many ways one can stay involved in EA, but in this case we had the obvious evidence that most of these people were still working at EA orgs, making it unnecessary to check if they were still involved in other ways.”
That said:
I think that, for various reasons that I mostly don’t pin on 80k[1], various people feel like working at an EA org is one of the most impactful and/or “EA-y” things to do, even if they don’t necessarily explicitly believe that. (I think you highlighted this well in your own recent post.) So it seems worth being extra careful about things that could accidentally exacerbate that feeling.
The text I quoted does sound like it categorises 2 of the first set of 14 people as “becoming less involved” because they’re not working at an EA org, without saying anything about whether they’re still doing potentially high-impact things.
On the other hand, it also suggests that 3 of the set of 24 people aren’t working at EA orgs but are still not considered to have dropped out, which pushes against this.
Plus, it’s totally plausible Ben did consider whether those 2 people were doing other potentially high-impact things, found they seemed not to be (or not as much as they had been), and just didn’t mention that.
Also, to be clear, I didn’t mean my original comment as even a mild criticism of this post, really. I just thought it would be useful for this point to be explicitly made, to push against an impression some people might erroneously form after reading this post.
[1] To the extent to which it seems plausible that 80k has contributed to this phenomenon, I don’t think it would’ve been easy for someone else to have done better. I think 80k has an unusual degree of prominence and respect in the EA community that makes it unusually likely that people will be influenced by 80k in ways that 80k didn’t intend, even if 80k is doing a well-above-average job of communicating carefully and with nuance. (And I indeed think 80k is doing a well-above-average job of that.)
FWIW, I did a quick meta-analysis in Stan of the adjusted 5-year dropout rates in your first table (for those surveys where the sample size is known). The punchline is an estimated true mean cross-study dropout rate of ~23%, with a 90% CI of roughly [5%, 41%]. For good measure, I also fit the data to a beta distribution and came up with a similar result.
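For concreteness, a minimal sketch of the kind of model I mean, with binomial dropout counts pooled on the log-odds scale (the priors and parameterisation shown here are illustrative, not a record of the exact code):

```stan
// Random-effects meta-analysis of 5-year dropout counts on the log-odds
// scale. Illustrative sketch: priors and parameterisation are assumptions.
data {
  int<lower=1> J;                  // number of surveys
  array[J] int<lower=1> n;         // sample size of each survey
  array[J] int<lower=0> y;         // dropouts observed in each survey
}
parameters {
  real mu;                         // population mean log-odds of dropout
  real<lower=0> tau;               // cross-study SD of log-odds
  vector[J] theta;                 // study-level log-odds
}
model {
  mu ~ normal(0, 2);
  tau ~ normal(0, 1);              // half-normal, via the lower bound
  theta ~ normal(mu, tau);
  y ~ binomial_logit(n, theta);    // each survey's dropout count
}
generated quantities {
  real mean_rate = inv_logit(mu);  // implied cross-study mean dropout rate
}
```

In a model of this shape, the headline ~23% corresponds to the posterior mean of mean_rate.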
I struggle with how to interpret these numbers. It’s not clear to me that the community dropout rate is a good proxy for value drift (however it’s defined), as in some sense it is a central hope of the community that the values will become detached from the movement—I think we want more and more people to feel “EA-like”, regardless of whether they’re involved with the community. It’s easy for me to imagine that people who drift out of the movement (and stop answering the survey) maintain broad alignment with EA’s core values. In this sense, the “core EA community” around the Forum, CEA, 80k, etc is less of a static glob and more of a mechanism for producing people who ask certain questions about the world.
Conversely, value drift within members who are persistently engaged in the community seems to be of real import, and presumably the kind of thing that can only be tracked longitudinally, by matching EA Survey respondents across years.
Hi Matt,
It’s cool you did that, though I wouldn’t recommend simply combining all the samples, since they’re for really different groups at very different levels of engagement (which predictably leads to very different dropout rates).
A quick improvement would be to split into a highly engaged and a broader group.
The highly engaged meta-analysis could include: Joey’s 50% donors; CEA weekend away highly engaged subset; 80k top plan changes; CEA early employees.
The broader meta-analysis could be based on: GWWC estimate; EA survey estimate; Joey 10% donors; CEA weekend away entire sample.
I’d be keen to see the results of this!
This is the reason for doing a random effects meta-analysis in the first place: the motivating assumption is that the populations across studies are very different and so are the underlying dropout rates (e.g. differing estimates are due not just to within-study variation but also to cross-study variation of the kind you describe).
Still, it was sloppy of me to describe 23% as the true estimate above; in a random-effects model there is no single true estimate. A better takeaway is that, within the scope of the kind of variation we see across these survey populations, we’d almost certainly expect to see dropout of less than 40%, regardless of engagement level. Perhaps straining the possibilities of the sample size, I ran the analysis again with an intercept for engagement: high engagement seems to be worth about 21 percentage points of reduced dropout likelihood on the 5-year frame.
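Sketching that engagement version (again illustrative; the indicator coding and priors are my assumptions):

```stan
// The same random-effects sketch, extended with a fixed effect for
// engagement level. Illustrative: coding and priors are assumptions.
data {
  int<lower=1> J;
  array[J] int<lower=1> n;
  array[J] int<lower=0> y;
  vector[J] x;                     // 1 = highly engaged sample, 0 = broader
}
parameters {
  real mu;                         // mean log-odds of dropout, broader samples
  real beta;                       // engagement effect on the log-odds scale
  real<lower=0> tau;               // residual cross-study SD
  vector[J] theta;
}
model {
  mu ~ normal(0, 2);
  beta ~ normal(0, 1);
  tau ~ normal(0, 1);
  theta ~ normal(mu + beta * x, tau);
  y ~ binomial_logit(n, theta);
}
generated quantities {
  real rate_broad = inv_logit(mu);           // implied dropout, broader samples
  real rate_engaged = inv_logit(mu + beta);  // implied dropout, engaged samples
}
```

Note the effect is additive on the log-odds scale, so the implied percentage-point difference depends on the baseline rate.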
>60% persistence in the community at large seems pretty remarkable to me. I understand that you haven’t been able to benchmark against similar communities, but my prior on dropout from youth movements (a class I think EA falls into) would be considerably higher. Do you have a reference class for the EA community in mind? If so, what’s in it?
Thank you, that’s helpful!
Do you mean 21 percentage points, so if the overall mean is 23%, then the most engaged are only 2%? Or does it mean 21% lower, in which case it’s 18%?
I’m not aware of a good reference class where we have data—I’d be keen to see more research into that.
It might be worth saying that doing something like taking the GWWC pledge is still a high level of engagement & commitment on the scale of things, and I would guess significantly higher than the typical young person affiliating with a youth movement for a while.
(The mean & median age in EA is also ~28 right now, so while still on the youthful side, it’s mostly not students or teenagers.)
The former! This is pretty sensitive to modeling choices: specified a different way, I get an engagement effect of 31 percentage points (38% vs. 7% dropout).
The modeling assumption made here is that engagement level shifts the whole distribution of dropout rates, which otherwise looks the same; not sure if that’s justifiable (seems like not?), but the size of the data is constraining. I’d be curious to hear what someone with more meta-analysis experience has to say about this, but one way to approximate value drift via a diversity of measurements might be to pile more proxy measurements into the model—dropout rates, engagement reductions, and whatever else you can come up with—on the basis that they are all noisy measurements of value drift.
I’d be super curious to know if the mean/median age of EA right now is a function of the people who got into it as undergrads or grad students several years ago and who have continued to be highly engaged over time. Not having been involved for that long, I have no idea whether that idea has anecdotal resonance.
It would be super interesting to work on how to improve “retention” through social integration. I was thinking that having a regular gather.town “mega meeting” of EAs may be pretty nice in times of confinement, to promote social interactions, project collaborations, etc.
Do you have in mind that people who support more mainstream issues—like climate change or global health and development, rather than AI safety—are more likely to leave EA because they have more alternative options of people to talk to? Or the same prediction, but because of something else, like a focus on more distinctly EA issues being evidence of a “more distinctly EA mindset”?
Or do you have in mind that people who support issues that are perceived as “weirder” within EA—like anti-ageing or psychedelics research—are more likely to leave EA?
The comment on AI safety having become less weird prompted the following thought: Perhaps a (weak) argument that drop-out rates will increase in future is that:
1. Various EA focuses may become more mainstream (either because of EA’s success or for other reasons)
2. If that happens, EAs may be more likely to drift out of EA, as there’s no longer as much gained by them being in EA specifically (since there are now more non-EA people for them to talk to, collaborate with, work for, etc.)
But perhaps 1 and/or 2 are quite unlikely. Or perhaps we shouldn’t call that “drop-out” exactly, since the people would still be focused on issues we consider important (just not “under an EA banner”).
Thanks, I find this very useful!
I guess I would refine the “weird cause area” reason by adding that some EAs may leave because of strong disagreement with some mainstream EA views or with public figures’ views. For example, a few years ago climate change was not treated as an x-risk, and was somewhat regularly dismissed, which would have put off a few longtermists. I know someone who left EA because of strong disagreement with how AI safety is handled, e.g. encouraging people to work for an organisation that works on AGI development. Basically, I think that sometimes there is a “tipping point” of strong disagreement where some people leave. Ideally, EA would be able to strongly focus on “EA is a question, not an ideology”, so that people who have informed but different opinions still stay in.
I suspect that burnout may also be another reason why people in EA orgs leave.
Cause preference (i.e. prioritising different causes than the EA community or thinking that the EA community focused too much on particular causes and ignored others) was the second most commonly cited reason among people who reported declining interest in EA.
Thank you, this list is a useful complement to this post.
Thanks for this useful summary!
Note that section 4 reiterates Peter Hurford’s analysis in a post from last year.
One possibility is to take a look at the top contributors to Felicifia, an early EA/utilitarian forum, and note how many are still around. Louis Francini kindly restored the original site earlier this year, which had been down for a long time, so this can be done very easily.
Ah sorry I meant to link to Peter Hurford’s analysis—I’ll add it now.
My understanding is that David/Rethink has a reasonably accurate model of this, i.e. they can predict how someone would respond to the engagement questions on the basis of how they answered other questions.
It might be interesting to try doing this to get data from prior years.
These are still the best data on community dropout I’m aware of.
Nice work with this!
One thing that comes to mind (though perhaps a bit strange) is to consider Effective Altruism under a similar lens as you would a SaaS product. In the SaaS (software as a service) industry, there are well-established best practices around understanding retention rates and churn, and around doing cohort analyses and the like. There’s also literature on evaluating the quality of a product using NPS scores and similar measures. It could be neat to have people rate “Effective Altruism” and “The EA Community” on NPS scores.
Likewise, it could be interesting to survey people with things like, “How would you rate the value you are getting from the EA ecosystem”, and then work to maximize this value. Consider the costs (donations, career changes) vs. the benefits and see if you can model total value better.
Hey Ozzie, that makes sense. I think the last EA survey did some things pretty similar to this, inc. asking about value adds & issues, and something similar to the NPS score, as well as why people don’t recommend it.
Yeh much of this is in our Community Information post where we:
asked an ‘NPS’ question about EA, asked for qualitative information about positive/negative experiences of the community and examined predictors
asked about barriers to becoming more involved in EA
asked about reasons for people’s interest in EA declining or increasing
asked about what factors were important for retaining people in EA
asked about why people who the respondent knew dropped out
I’m pretty sceptical about the utility of Net Promoter Score in the classical sense for EA. I don’t think there’s any good evidence for the prescribed way of calculating Net Promoter Score (ignoring respondents who answer in the upper-middle of the scale, and then subtracting the proportion of people who selected one of the bottom 7 response levels from the proportion who selected one of the top two). And, as I mentioned in our original post, its validity and predictive power have been questioned. Furthermore, one of the most common uses is comparing an entity’s NPS to an industry benchmark (e.g. the average scores for other companies in the same industry), but it’s very unclear what reference class would be relevant for the EA community as a whole, so it’s fundamentally not clear whether EA’s NPS is good or bad. In the specific case of EA, I also suspect that the question of how excited one would be to recommend EA to a suitable friend may well be picking up on attitudes other than satisfaction with EA, i.e. literally how people would feel about recommending EA to someone. This might explain why the people with the highest ‘NPS’ responses (we just treated the measure as a straightforward ordinal variable in our own analyses) were people who had just joined EA, and why scores fairly reliably became lower over time.
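(For concreteness, the standard prescribed calculation on a 0–10 scale, with “promoters” scoring 9–10 and “detractors” scoring 0–6, is

$$\mathrm{NPS} = 100 \times \frac{n_{\text{promoters}} - n_{\text{detractors}}}{N},$$

so the upper-middle 7–8 responses simply drop out of the numerator.)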
Are you assuming quite short careers? Using bucket midpoints I calculate
Which suggests you are using ~24 years for a full career, which seems a little low. If I substitute 40 years I get over 30 years of engagement.
The answer does not change very much when I converted these numbers to annualised risk factors in Excel (and assumed 100% drop-off at year 40).
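(For reference, the standard constant-hazard conversion from a 5-year rate to an annual rate is

$$r_{\text{annual}} = 1 - (1 - r_{5\,\text{yr}})^{1/5};$$

e.g. a 23% 5-year dropout rate corresponds to about $1 - 0.77^{1/5} \approx 5.1\%$ per year.)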
I was doing a very hacky calculation—I’ll change to 30 years and mention your comment.