Looking further back at the data, the numbers of valid responses from self-identified EAs were ~1,200 in 2014, ~2,300 in 2015, and ~1,800 in 2017; the numbers discussed here suggest that the number of people sampled has stayed about the same since then.
Comments:
Not sure about the jump from 2014 to 2015; I’d expect some combination of broader outreach by GWWC, possibly some technical issues with the survey data (?), and more awareness that there was an EA Survey in the first place?
I was surprised that the overall number of responses did not change significantly from 2015 to 2017. Perhaps this could be explained by the fact that no survey was run in 2016?
I would also expect there to be some increase from 2015 to 2020, even taking into account David’s comment on the survey being longer. But there are probably lots of alternative explanations here.
I was going to try and compare the survey responses to the estimated community size since 2014-2015, but realised that there don’t seem to be any population estimates aside from the 2019 EA Survey. Are there estimates of the population size for earlier years?
Not sure about the jump from 2014 to 2015; I’d expect some combination of broader outreach by GWWC, possibly some technical issues with the survey data (?), and more awareness that there was an EA Survey in the first place?
I think the total number of participants for the first EA Survey (EAS 2014) is basically not comparable to the later EA Surveys. It could be that higher awareness in 2015 than in 2014 drives part of this, but there was definitely less distribution for EAS 2014 (it wasn’t shared at all by some major orgs). Whenever I am comparing numbers across surveys, I basically don’t look at EAS 2014 (which was also substantially different in terms of content).
The highest comparability between surveys is for EAS 2018, 2019 and 2020.
I was surprised that the overall number of responses did not change significantly from 2015 to 2017. Perhaps this could be explained by the fact that no survey was run in 2016?
Appearances here are somewhat misleading: although there was no EA Survey run in 2016, there was actually a similar amount of time between EAS 2015 and EAS 2017 as between any of the other EA Surveys (~15 months). But I do think it’s possible that the appearance of skipping a year reduced turnout in EAS 2017.
I was going to try and compare the survey responses to the estimated community size since 2014-2015, but realised that there don’t seem to be any population estimates aside from the 2019 EA Survey. Are there estimates of the population size for earlier years?
We’ve only attempted this kind of model for EAS 2019 and EAS 2020. To use similar methods for earlier years, we’d need similar historical data to use as a benchmark. EA Forum data from back then may be available, but it may not be comparable in terms of the fraction of the population it’s serving as a benchmark for. Back in 2015, the EA Forum was much more ‘niche’ than it is now (~16% of respondents were members), so we’d be basing our estimates on a niche subgroup, rather than a proxy for highly engaged EAs more broadly.
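To make the benchmark logic concrete, here is a minimal sketch (in Python, with purely hypothetical numbers rather than actual EA Survey figures) of how this kind of estimate scales the total respondent count by the response rate observed in a subgroup of known size, and why a niche, unusually engaged benchmark would bias the result:

```python
# Illustrative sketch of a benchmark-based population estimate.
# All numbers are hypothetical placeholders, not actual EA Survey figures.

def estimate_population(total_respondents, benchmark_respondents, benchmark_true_size):
    """Scale up the total respondent count by the sampling rate observed in a
    benchmark subgroup whose true size is known. This assumes the benchmark
    group responds at roughly the same rate as the wider community."""
    sampling_rate = benchmark_respondents / benchmark_true_size
    return total_respondents / sampling_rate

# Hypothetical inputs: 2,000 total respondents, 600 of whom belong to a
# benchmark group (e.g. a membership list) known to have 2,500 members.
print(estimate_population(2000, 600, 2500))  # sampling rate 0.24 -> ~8,333 people

# If the benchmark is a niche, highly engaged subgroup (as the EA Forum was in
# 2015), its members respond at a higher rate than typical community members,
# so the inferred sampling rate is too high and the estimate is biased downward.
```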