EA Survey 2018 Series: How Long Do EAs Stay in EA?
Not everyone who joins the effective altruism community stays around forever. Some people's values drift, some leave altogether, and some continue to do EA-aligned things but choose to withdraw from the community. Better understanding how and why people leave EA is important for assessing our overall community health and impact, but reliable data on this can be very hard to get.
In its 2014 Impact Analysis, Giving What We Can noted that 1.7% of members leave each year and an additional 4.7% of members "go silent" each year, meaning that GWWC had not been able to get a response from them after two years of roughly annual check-ins, putting the total share of people ceasing donations at 5.8% per year. This would suggest that ~74% of people involved in GWWC remain involved after five years.
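As a quick check on that arithmetic, here is a minimal sketch of the compounding calculation, assuming (as the extrapolation above does) that the 5.8% combined attrition rate is constant from year to year:

```python
# Compound a constant annual attrition rate over five years.
annual_attrition = 0.058  # GWWC's combined rate of leaving or going silent

five_year_retention = (1 - annual_attrition) ** 5
print(f"{five_year_retention:.1%}")  # ~74.2%, matching the ~74% figure above
```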
This figure is a good starting point, but it is fairly out of date, having been collected five years ago. It also may not be representative of the overall EA community: being an active part of EA carries a high time cost but may require no financial cost, whereas pledging to donate 10% carries a high financial cost but little time cost beyond sending in a check and returning an email once a year or so.
To address this question from a different perspective, we use data from the three EA Surveys run in 2015, 2017, and 2018.
Longitudinal Data by Email Address
People who take the EA Survey each year can optionally give their email address. If they do, the address is securely encrypted and stored so that we can track whether they take the EA Survey in multiple years while preserving anonymity. As long as people give the same email address each year, we can see how many respondents to prior surveys come back and take the survey again, showing that they remain part of the EA community.
Table 1 shows that, of the people who took the 2015 survey, identified as EA, and gave an email address, 16% returned with the same email address for the 2017 EA Survey, and 15% did so for the 2018 EA Survey three years later. Of the people who gave an email address in the 2017 EA Survey, 27% came back for the 2018 EA Survey with the same address.
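For illustration, here is a minimal sketch of how this kind of matching can work. The post describes the addresses as securely encrypted; a salted one-way hash, as used here, is one common variant, and the emails and salt below are hypothetical:

```python
import hashlib

def hash_email(email: str, salt: str = "survey-salt") -> str:
    """One-way hash so addresses can be matched across years without storing them in the clear."""
    return hashlib.sha256((salt + email.strip().lower()).encode()).hexdigest()

# Hypothetical respondent emails from two survey years.
emails_2015 = {"a@example.com", "b@example.com", "c@example.com"}
emails_2018 = {"a@example.com", "d@example.com"}

hashes_2015 = {hash_email(e) for e in emails_2015}
hashes_2018 = {hash_email(e) for e in emails_2018}

returned = hashes_2015 & hashes_2018
print(f"Retention by email match: {len(returned) / len(hashes_2015):.0%}")  # 33% in this toy example
```

Anyone who switches addresses between surveys is counted as lost, which is one reason this method tends to understate retention.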
Longitudinal Data by Comparing Samples
The above analysis is complicated by the possibility that people come back in future surveys but decline to give an email address or give a different one, and by people who return but never gave an email address in past years. To track the population more holistically, regardless of email address, we can instead compare populations.
Specifically, any EA who took the EA Survey in a particular year must have joined EA in that year or earlier. We can therefore look at later EA Surveys, see when respondents say they joined EA, and compare those counts to the original totals. Figure 1 does this: the blue bars plot the number of people who took the EA Survey each year, and the orange line plots the number of 2018 respondents who report having joined EA in each year. The orange line dips below the total number of respondents for the 2014 and 2015 EA Surveys, showing that many people who must have joined EA in 2015 or earlier no longer report this in later surveys, meaning they most likely dropped out of the EA movement, or at least out of the population of EAs who fill out the EA Survey.
This is more accurate than email tracking in that it captures more people (such as those who didn't give an email address or who changed addresses), but less accurate in that people who say they joined EA earlier may show up only in later surveys, offsetting people who dropped out and making the retention rate appear higher than it actually is.
The data behind this graph appears in Table 2; it suggests a retention rate of around 60%.
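To make the comparison concrete, here is a sketch of the implied calculation with placeholder counts; these are illustrative numbers, not the actual figures from Table 2:

```python
# Placeholder counts for illustration, NOT the actual Table 2 figures.
took_2015_survey = 2000          # everyone here must have joined EA in 2015 or earlier
joined_by_2015_per_2018 = 1200   # 2018 respondents who report joining EA in 2015 or earlier

implied_retention = joined_by_2015_per_2018 / took_2015_survey
print(f"Implied retention: {implied_retention:.0%}")  # 60% with these placeholder numbers
```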
Conclusion
Longitudinal EA Survey data provides a potential new source of evidence on EA retention, tracking how people engage with the EA movement and how that engagement changes over time. The sample-comparison method suggests that roughly 60% of EAs stay around after 4-5 years. Email matching shows a much lower retention rate of ~16% over the same period, but it is likely less accurate because people change email addresses. The original GWWC data suggests a ~74% retention rate over the same period, but it annualizes and extrapolates a single year's trend, which likely makes it an overestimate.
However, there is still a lot of room for interpretation. Despite our best attempts to distribute the survey widely, the population of survey respondents may still differ from those who actively engage with the EA movement, and even those who actively engage may fail to fill out the EA Survey. It is also difficult to benchmark retention data when we lack comparable figures for other movements that would tell us what is normal.
Coda
This post is part of the supplementary posts for the EA Survey 2018 Series. The annual EA Survey is a project of Rethink Charity with analysis and commentary from researchers at Rethink Priorities.
This post was written by Peter Hurford with analysis by Peter Hurford and David Moss. Thanks to Tee Barnett and Marcus Davis for additional review and editing.
Other articles in the EA Survey 2018 Series include:
I—Community Demographics & Characteristics
II—Distribution & Analysis Methodology
III—How do people get involved in EA?
IV—Subscribers and Identifiers
VIII—Where People First Hear About EA and Higher Levels of Involvement
IX—Geographic Differences in EA
X—How Welcoming is EA?
XII—Do EA Survey Takers Keep Their GWWC Pledge?
Prior EA Surveys include:
The 2017 Survey of Effective Altruists
The 2015 Survey of Effective Altruists: Results and Analysis
The 2014 Survey of Effective Altruists: Results and Analysis
If you like work from Rethink Priorities, please consider subscribing to our newsletter. You can see all our work to date here.
Thanks for pulling this together!
Do you think the real retention rate is closer to 74%, 60%, or 16%?
What weighting would you give each data source, if you were to aggregate them into a single point estimate of EA’s retention rate?
It’s hard to say for sure because each data source has its drawbacks, but I think I put the most weight on the “Longitudinal Data by Comparing Samples” method and would guess the real retention rate is close to 60%.
Got it, thanks.
I suppose the salient question then becomes: “why do 40% of folks who get excited about EA end up leaving after a few years?”
This is the key question and more research is needed.
Generally, my guess is that in most cases the answer is very mundane: people's careers change, they graduate college and are no longer involved with other EAs, they get busier and no longer have time for EA engagement, they have a kid and focus on raising them, they move towns and get involved with something else, or their fascinations simply change even without any life changes. It is easy to identify with a movement and then drift out to something else. For one anecdotal example, in 2009-2011 I strongly identified with the atheist movement but then drifted out due to an increasing lack of interest (and a newfound interest in EA).
It might be possible to use the email method to identify particular people who do not return and see how they compare to the population that does return, but I am nervous that these populations are just hopelessly confounded by people changing email addresses or general noise with small samples.
Insofar as we can, we could also try to ask people who don't come back for their reasons, but these people are by definition hard to contact (since they left), and the sampling will likely be biased. I know EA London made one attempt at something like this, but it felt fairly inconclusive.
Peter, when you drifted away from the atheist movement, do you feel like your values and beliefs changed, or was it more about unsubscribing from newsletters and prioritizing different blogs and events?
I feel like my values and beliefs changed to some extent. I no longer feel like reducing the influence of religion is an important thing to do.
I can’t speak for Peter, but I also drifted away from caring much about atheism/humanism. In my case, I found that EA gave me all the rationality and caring-about-people that I’d been looking for, without the discussion of religion or focus on religion-related issues (which often felt repetitive or low-impact). My values and beliefs didn’t change; I just found a better way to fulfill them.
This raises a question: Is there something EA gave some of the people who left, which they then found more of in some other place?
Point of clarification: Those focus groups were specifically focused on people who do attend events, not people who left.
My answer is even more mundane: maybe people don’t give up on the EA ideas, they just stop spending time on the EA census.
That doesn't seem to work so well as an explanation of the dropoff GWWC found, which was of fairly similar magnitude. It seems less likely that people would find it too onerous to answer GWWC's (if I recall, fairly simple) request for confirmation that they are still meeting a formal pledge they took.
To what extent are these the same populations? How many people who took the census also took the pledge?
32.4% of the 2018 EA Survey respondents had taken the pledge (see our post). As of 2018, GWWC appears to have had just over 3,000 members, which suggests we captured around 25-30% of the total membership (presumably a subset that is, on average, more engaged in EA). My impression is that many GWWC members are not particularly engaged with (and perhaps do not even identify with) effective altruism at all, so it's no surprise that the total number of Pledge takers within our sample of EAs is smaller than the total population of Pledge takers.
I'm not sure what the implication of suggesting that these are different populations is, though. My observation was that the possibility that people simply "stop taking the EA census" doesn't serve well as an explanation of the dropoff that GWWC observes. It's possible that people drop out of the GWWC Pledge (or at least out of contact with GWWC's check-ins) for reasons unrelated to why people disappear from the EA Survey, though some overlap seems likely given the relationship between GWWC and EA. But people simply ceasing to complete the EA Survey can't explain away GWWC's similar rate of dropoff, so it remains a possible concern.
Kudos for doing this!
Question: Do you have thoughts on what updates we can make with this information? Are there any changes you think EAs should make from these updates?
I think this work is interesting, but it’s not obvious to me what decisions it’s particularly useful for.
I agree that I don’t know what to do with this information, especially without rates from other movements to compare it to, but I think it is good to start the conversation.
I'd agree it's a good start. Now that we have it, I would be particularly excited about next steps that include further thinking on the details of applications.
Why should the possibility of early EAs failing to take early surveys inflate the retention rate more than the possibility of early EAs failing to take later surveys deflates it? Shouldn't we expect these two effects to roughly cancel each other out? If anything, I would expect EAs in a given cohort to be slightly less willing to participate in the EA Survey with each successive year, since completing the survey arguably becomes more tedious the more you do it. If so, this methodology should slightly underestimate, rather than overestimate, the true retention rate. Apologies if I'm misunderstanding the reasoning here.
Yeah, I suppose you're right. I guess the point is that the offsetting effects still make it hard to estimate the true retention rate… not to mention any other differential non-response in survey taking.
Are you planning to update the analysis with data from the 2019 survey?
We aren’t currently planning on doing so—why?
I just thought it would be valuable to recalculate the estimated rates of attrition with this new data, though I think it’s totally fine for you to deprioritize this.