The survey is distributed with the help of several different EA orgs through a variety of channels (noted below). In this year’s EA Survey, the EA Forum was the largest referrer, followed by 80,000 Hours, and then a sharing link respondents could use to share the survey with others. In previous years the EA Newsletter has been more dominant, and my impression is that a big factor is simply the timing of when the different sources go out (e.g. many respondents would see the survey via either the Forum or the Newsletter, because they read both, and simply take it via whichever they see first).
| Source | n | Proportion |
|---|---|---|
| EA Forum | 1033 | 28.97% |
| 80,000 Hours | 743 | 20.84% |
| Shared (by respondent) | 431 | 12.09% |
| No referrer | 282 | 7.91% |
| E-mail (previously participated) | 275 | 7.71% |
| EA groups | 266 | 7.46% |
| EA Newsletter | 216 | 6.06% |
| Other | 141 | 3.95% |
| Dank EA Memes (Facebook) | 121 | 3.39% |
| EA Facebook | 31 | 0.87% |
| LessWrong | 27 | 0.76% |
One of the most important factors (unrelated to the particular referrers above) is that the survey recruits relatively more highly engaged EAs than less engaged ones. In particular, we recruit many more people who are moderately-to-highly engaged (levels 3-5), and very few with lower engagement (e.g. people who have only read EA materials for a few hours). I think this makes perfect sense, since we would expect people who are more engaged with the community to be more likely to take the survey.[1] Given this, you would likely want to consult the analyses we provide on differences between low- and high-engagement respondents to get a sense of the differences. Unfortunately, there’s no way to simply weight the results to get around this (as we could if we were doing a representative survey of the US public, for example), for the simple reason that no one knows the real composition of the full EA population, including those with low engagement with EA.
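To see concretely why the unknown population composition blocks weighting: post-stratification weights are just population shares divided by sample shares, so they cannot be computed without the former. A minimal sketch with entirely hypothetical engagement-level numbers (none of these figures are from the EAS):

```python
# Post-stratification weighting sketch. All numbers are hypothetical;
# the point is that the weights require known population shares,
# which is exactly what is unknown for the full EA population.

# Hypothetical sample counts by self-reported engagement level (1-5)
sample_counts = {1: 50, 2: 150, 3: 800, 4: 1200, 5: 1300}
sample_total = sum(sample_counts.values())

# Hypothetical (assumed, NOT known) population shares by engagement level
population_shares = {1: 0.30, 2: 0.25, 3: 0.20, 4: 0.15, 5: 0.10}

# Weight for each stratum = population share / sample share
weights = {
    level: population_shares[level] / (sample_counts[level] / sample_total)
    for level in sample_counts
}

for level, w in sorted(weights.items()):
    print(f"Engagement level {level}: weight {w:.2f}")
```

With these made-up inputs, low-engagement respondents would need weights far above 1 (here, level 1 gets a weight of 21), which also illustrates how fragile the whole exercise would be even if rough population shares were available.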
The other area where recruitment methods are particularly significant is where the referrers to the survey are also categories of interest in the survey questions. For example, as we have discussed before, 80,000 Hours is a large referrer to the survey, which may influence responses to questions which mention 80K. In last year’s post, we therefore conducted an additional analysis, where we found that removing 80K-referred respondents lowered the proportion selecting 80,000 Hours, though still left it among the most important sources (moving it from 50.7% to 45% of respondents). As we noted in that post, though, simply excluding everyone who was referred to the survey by 80K (or any other source) is likely to be a wild over-correction, since people influenced by that source were also more likely to be referred to the survey by it.
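The exclusion analysis described above is mechanically simple: recompute the proportion selecting a source after dropping respondents referred by that source. A hypothetical sketch (the counts below are illustrative, not the actual EAS data, and the pattern merely mimics the 50.7% to 45% shift reported above):

```python
# Hypothetical sensitivity check: how does excluding respondents who were
# referred to the survey by a source change the share selecting that source?

def proportion_selecting(respondents, source):
    """Share of respondents who selected `source` as important."""
    return sum(source in r["selected"] for r in respondents) / len(respondents)

# Toy data: each respondent has a survey referrer and a set of selected sources.
respondents = (
    [{"referrer": "80K", "selected": {"80K"}}] * 300
    + [{"referrer": "80K", "selected": set()}] * 100
    + [{"referrer": "Forum", "selected": {"80K"}}] * 250
    + [{"referrer": "Forum", "selected": set()}] * 350
)

overall = proportion_selecting(respondents, "80K")
non_80k = [r for r in respondents if r["referrer"] != "80K"]
excluded = proportion_selecting(non_80k, "80K")

print(f"All respondents:          {overall:.1%}")   # 55.0%
print(f"Excluding 80K-referred:   {excluded:.1%}")  # 41.7%
```

As in the real analysis, the proportion drops when 80K-referred respondents are excluded but remains substantial, and, as noted above, treating the excluded figure as the "true" one would over-correct.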
If there’s sufficient demand we could potentially produce another post looking at similar analyses, though my impression is that when we did so for last year’s EAS there wasn’t significant interest.
Though, as we note in the linked post, in practice people may be primarily interested in the responses of those who are at least moderately engaged. A lower-engagement respondent would be someone who has, e.g., “engaged with a few articles, videos, podcasts, discussions, events on effective altruism (e.g. reading Doing Good Better or spending ~5 hours on the website of 80,000 Hours)” but not necessarily engaged in any other way.
The most detailed discussion of this issue is in our bookdown from last year.