It looks like the number of survey respondents dropped by about 14% from last year. Before someone else asks whether this represents a shrinking movement, I'd like to share my view:
This was a weird year for getting people to participate in things. People were spending more time online, but I'd guess they also had less energy/drive to do non-fun things for altruistic reasons, especially if those things weren't related to the pandemic. I suspect that if we were to ask people who ran similar surveys, we'd often see response counts dropping to a similar degree.
Since this time last year, participation metrics are up across the board for many EA things: more local groups, much more Forum activity, more GiveWell donors, a much faster rate of growth for Giving What We Can, etc.
Hence, I don't see the lower survey response count as a strong sign of movement shrinkage so much as a sign of general fatigue showing up in the form of survey fatigue. (I think the survey was shared about as widely as it was last year, but differential sharing might also have mattered if it occurred.)
You probably already agree with this, but I think lower survey participation should make you think it's more likely that the effective altruism community is shrinking than you did before seeing that evidence.
If you as an individual or CEA as an institution have any metrics you track to determine whether effective altruism is growing or shrinking, I'd find it interesting to know more about what they are.
He mentioned a number of relevant metrics:
From context, that appears to be an incomplete list of metrics selected as positive counterexamples. I assumed there are others as well.
I do agree that lower survey participation is evidence in favor of a smaller community; I just think it's overwhelmed by other evidence.
The metrics I mentioned were the first that came to mind. Trying to think of more:
From what I've seen at other orgs (GiveDirectly, AMF, EA Funds), donations to big EA charities seem to generally be growing over time (GD is flat, the other two are way up). This isn't the same as "number of people in the EA movement", but in the case of EA Funds, I think "monthly active donors" are quite likely to be people who'd think of themselves that way.
EA.org activity is also up quite a bit (pageviews up 35% from Jan 1 - May 23, 2021 vs. 2020, avg. time on page also up slightly).
Are there any numbers that especially interest you, which I either haven't mentioned or have mentioned but not given specific data on?
Just for the record, I find the evidence that EA is shrinking or stagnating on a substantial number of important dimensions pretty convincing. Relevant metrics include traffic to many EA-adjacent websites, Google trends for many EA-related terms, attendance at many non-student group meetups, total attendance at major EA conferences, number of people filling out the EA survey, and a good amount of community attrition among a lot of core people I care a lot about.
In terms of pure membership, I think EA has probably been pretty stable with some minor growth. I think it's somewhat more likely than not that average competence among members has been going down, because new members don't seem as good as the members I've seen leave.
It seems very clear to me that growth is much slower than it was in 2015-2017, based on basically all available metrics. The obvious explanation - "sometime around late 2016, lots of people decided that we should stop pursuing super-aggressive growth" - seems relatively straightforward and explains the data.
Re: web traffic and Google trends - I think Peter Wildeford (né Hurford) is working on an update to his previous post on this. I'll be interested to see what the trends look like over the past two years, given all the growth on other fronts. I would see continued decline/stagnation of Google/Wikipedia interest as solid evidence for movement shrinkage/stagnation.
Do you have data on this across many meetups (or even just a couple of meetups in the Bay)?
I could easily believe this is happening, but I'm not aware of whatever source the claim comes from. (Also reasonable if it comes from e.g. conversations you've had with a bunch of organizers - just curious how you came to think this.)
Re: total attendance at major EA conferences - this seems like much more a function of "how conferences are planned and marketed" than "how many people in the world would want to attend".
In my experience (though I haven't checked this with CEA's events team, so take it with a grain of salt), EA Global conferences have typically targeted certain numbers of attendees rather than aiming for as many people as possible. This breaks down a bit with virtual conferences, since it's easier to "fit" a very large number of people, but I still think the marketing for EAG Virtual 2020 was much less aggressive than the marketing for some of the earliest EAG conferences (and I'd guess that the standards for admission were higher).
If CEA wanted to break the attendance record for EA Global with the SF 2022 conference, I suspect they could do so, but there would be substantial tradeoffs involved (e.g. between size and average conversation quality, or size and the need for more aggressive marketing).
I think we basically agree on this - I don't know that I'd say "much", but certainly "slower", and the explanation checks out. But I do think that growth is positive, based on the metrics I've mentioned, and that EA Survey response counts don't mirror that overall trend.
(None of this means that EA is doing anywhere near as well as it could/should be - I don't mean to convey that I think current trends are especially good, or that I agree with any particular decision of the "reduce focus on growth" variety. I think I'm quite a bit more pro-growth than the average person working full-time in "meta-EA", though I haven't surveyed everyone about their opinions and can't say for sure.)
I am also definitely interested in Peter Wildeford's new update on that post, and have been awaiting it with great anticipation.
My personal non-data-driven impression is that things are steady overall. Contracting in SF, steady in NYC and Oxford, growing in London and DC. "Longtermism" growing. Looking forward to seeing the data!
Looking farther back at the data, numbers of valid responses from self-identified EAs: ~1,200 in 2014, ~2,300 in 2015, ~1,800 in 2017, and then the numbers discussed here suggest that the number of people sampled has stayed about the same.
Comments:
Not sure about the jump from 2014 to 2015 - I'd expect some combination of broader outreach by GWWC, maybe some technical issues with the survey data (?), and more awareness of there being an EA Survey in the first place?
I was surprised that the overall number of responses did not change significantly from 2015 to 2017. Perhaps this could be explained by the fact that no survey was taken in 2016?
I would also expect there to be some increase from 2015-2020, even taking into account David's comment on the survey being longer. But there are probably lots of alternative explanations here.
I was going to try to compare the survey responses to the estimated community size since 2014-2015, but realised that there don't seem to be any population estimates aside from the 2019 EA Survey. Are there estimates of population size for earlier years?
Re: the jump from 2014 to 2015 - I think the total number of participants for the first EA Survey (EAS 2014) is basically not comparable to the later EA Surveys. It could be that higher awareness in 2015 than 2014 drives part of this, but there was definitely less distribution for EAS 2014 (it wasn't shared at all by some major orgs). Whenever I am comparing numbers across surveys, I basically don't look at EAS 2014 (which was also substantially different in terms of content).
The highest comparability between surveys is for EAS 2018, 2019 and 2020.
Appearances here are somewhat misleading: although there was no EA Survey run in 2016, there was actually a similar amount of time between EAS 2015 and EAS 2017 as between any of the other EA Surveys (~15 months). But I do think it's possible that the appearance of skipping a year reduced turnout in EAS 2017.
We've only attempted this kind of model for EAS 2019 and EAS 2020. To use similar methods for earlier years, we'd need similar historical data to use as a benchmark. EA Forum data from back then may be available, but it may not be comparable in terms of the fraction of the population it's serving as a benchmark for. Back in 2015, the EA Forum was much more "niche" than it is now (~16% of respondents were members), so we'd be basing our estimates on a niche subgroup, rather than a proxy for highly engaged EAs more broadly.
I think that the reduction in numbers in 2019 and then again in 2020 is quite likely to be explained by fewer people being willing to take the survey due to it having become longer/more demanding since 2018. (I think this change, in 2019, reduced respondents a bit in the 2019 survey and then also made people less willing to take the 2020 survey.)
We can compare data across different referrers (e.g. the EA Newsletter, EA Facebook, etc.) and see that there were fairly consistent drops across most referrers, including those that we know shared it no less than they did last year (e.g. the same email being sent out the same number of times), so I don't think this explains it.
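(For readers who want to replicate this kind of check, here is a minimal sketch of the comparison. The referrer names and counts are invented for illustration; the real analysis uses the survey's actual referral data.)

```python
import pandas as pd

# Hypothetical sketch of the referrer comparison described above.
# Referrer names and counts are invented for illustration only.
counts = pd.DataFrame({
    "referrer": ["EA Newsletter", "EA Facebook", "EA Forum"],
    "respondents_2019": [400, 350, 600],
    "respondents_2020": [340, 300, 510],
})

# A similar percentage drop across referrers that we know shared the
# survey just as much as before points to a common cause (e.g. survey
# length) rather than reduced sharing by any one channel.
counts["pct_change"] = counts["respondents_2020"] / counts["respondents_2019"] - 1
print(counts)
```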
We are considering looking into growth and attrition (using cross-year data) more in a future analysis.
Also note that because the drop began in 2019, I don't think this can be attributed to the pandemic.
In the world where changes to the survey explain the drop, I'd expect to see a similar number of people click through to the survey (especially in 2019) but a lower completion rate. Do you happen to have data on the completion rate by year?
If the number of people visiting the survey has dropped, then that seems consistent with the hypothesis that the drop is explained by the movement shrinking, unless the increased time cost of completing the survey was made very clear upfront in 2019 and 2020.
Unfortunately (for testing your hypothesis in this manner), the length of the survey is made very explicit upfront. The estimated length of EAS 2019 was 2-3x that of EAS 2018 (as it happened, this was an over-estimate, though the survey was still much longer than in 2018), while the estimated length of EAS 2020 was a mere 2x that of EAS 2018.
Also, I would expect a longer, more demanding survey to lead to fewer total respondents in the year of the survey itself (and not merely lagged a year), since I think current-year uptake can be influenced by word of mouth and sharing (I imagine people would be less likely to share the survey or recommend that others take it if they found it long or annoying).
That said, as I noted in my original comment, I would expect to see lag effects (the survey being too long reduces response to the next year's survey), and I might expect these effects to be larger (and to stack if the next year's survey is itself too long). This is exactly what we see: a moderate dip from 2018 to 2019 and then a much larger dip from 2019 to 2020.
"Completion rate" is not entirely straightforward, because we explicitly instruct respondents that the final questions of the survey are especially optional "extra credit" questions, and that they should feel free to quit the survey before these. We can, however, look at the final questions of the main section of the survey (before the extra credit section), and here we see roughly the predicted pattern: a big drop in those "completing" the main section from 2018 to 2019, followed by a smaller absolute drop from 2019 to 2020 - even though the percentage of those who started the survey and completed the main section actually increased between 2019 and 2020 (which we might expect if some of the people less inclined to take the survey were put off taking it at all).
Another (and I think better) way of examining whether we are simply sampling fewer people or whether the population has shrunk is to compare numbers for subpopulations of the EA Survey to known population sizes, as we did here.
In 2019, we estimated that we sampled around 40% of the "highly engaged" EA population. In 2020, using updated numbers, we estimated that we sampled around 35% of the highly engaged EA population.
If the true EA population had remained the same size from 2019 to 2020 and we just sampled 35% rather than 40% overall, we would expect the number of EAs sampled in 2020 to decrease from 2513 to 2199 (which is pretty close to the 2166 we actually sampled).
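(To spell out the arithmetic behind that expectation: the 2020 count is just the 2019 count rescaled by the ratio of the two sampling rates.)

$$2513 \times \frac{0.35}{0.40} \approx 2199$$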
But, as noted, we believe that we usually sample high- and low-engagement EAs at different rates (sampling relatively fewer less engaged EAs). And if our sampling rate drops overall, I would expect (a priori) that it would hold up better among highly engaged EAs than among less engaged EAs (who may be less motivated to take the survey and more responsive to the survey becoming too onerous to complete).
The total number of highly engaged EAs in our sample this year was similar to/slightly higher than 2019, implying that the population slightly increased in size (as I would expect). We don't have any good proxies for the size of the less engaged EA population (this becomes harder and harder as we consider the progressively larger populations of progressively less engaged EAs), but I would guess that we probably experienced a yet larger reduction in sampling rate for this population, and that the true size of the population of less engaged EAs has probably increased. (I would probably look to things like the EA Newsletter, subscribers to 80,000 Hours mailing lists, and so on for proxies for that group, but Aaron/people at 80K may disagree.)
If the sampling rate of highly engaged EAs has gone down from 40% to 35%, but the number of them was the same, that would imply 14% growth.
You then say:
So the total growth should be 14% + growth in highly engaged EAs.
Could you give me the exact figure?
926 highly engaged in EAS 2019, 933 in EAS 2020.
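(Putting those counts together with the estimated sampling rates gives a rough implied growth figure. This is a sketch only - the 40% and 35% sampling rates are themselves rough estimates, so the output inherits their uncertainty.)

```python
# Back-of-envelope sketch using the figures from this thread.
n_2019, rate_2019 = 926, 0.40  # highly engaged respondents, est. sampling rate
n_2020, rate_2020 = 933, 0.35

pop_2019 = n_2019 / rate_2019  # ~2315 implied highly engaged EAs in 2019
pop_2020 = n_2020 / rate_2020  # ~2666 implied highly engaged EAs in 2020

growth = pop_2020 / pop_2019 - 1
print(f"Implied growth in highly engaged EAs: {growth:.1%}")  # ~15.1%
```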
Of course, much of this growth in the number of highly engaged EAs is likely due to EAs becoming more engaged, rather than there being more EAs. As it happens, EAS 2020 had more 4s but fewer 5s, which I think can plausibly be explained by the general reduction in sampling rate mentioned above, combined with a number of 1-3s moving into the 4 category and fewer 4s moving into the 5 category (which is more stringent, e.g. EA org employee, group leader, etc.).
Last year, we estimated the response rate by surveying various groups (e.g. everyone who works at 80K) about who among them took the survey that year.
This showed that among highly engaged EAs, the response rate was ~40%, which let David make these estimates.
If we repeated that process this year, we could make a new estimate of the total number of EAs, which would give us an estimate of the growth/shrinkage since 2019. This would be a noisy estimate, but it's one of the better methods I'm aware of, so I'd be excited to see this happen.
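(A minimal sketch of that estimation method, assuming a hypothetical benchmark group - the function name and all figures below are invented for illustration.)

```python
# Sketch of the benchmark-group method described above: take a group
# whose true size is known (e.g. the staff of one org), observe what
# fraction of them took the survey, and scale up the survey total.
# All names and numbers here are hypothetical.

def estimate_population(total_respondents: int,
                        benchmark_size: int,
                        benchmark_respondents: int) -> float:
    """Estimate true population size from a known-size benchmark group."""
    response_rate = benchmark_respondents / benchmark_size
    return total_respondents / response_rate

# e.g. if 8 of 20 staff at a known org took the survey (a 40% response
# rate) and 926 highly engaged EAs responded overall:
print(estimate_population(926, benchmark_size=20, benchmark_respondents=8))
# -> 2315.0 implied highly engaged EAs
```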
I kinda do this in this comment above: not estimating the total size again directly, but showing that I don't think the current numbers suggest a reduction in size.