I can see how the non-response rate looks alarming and I definitely owe some context for that.
One thing we tried this year was a separate donations-only survey, where people reported their donations and answered only a few other questions. Race was not on this slimmer survey, so 554 respondents did not answer the race question simply because they were never asked it.
Another source of apparent non-response is that we asked a separate Yes/No question for each of four races (White, Black, Asian, and Hispanic). It looks like some people checked “Yes” for one race but did not explicitly check “No” for the others, which registers as missing data on those other questions. This accounts for another 120 people.
Setting aside those first two groups, only 67 people genuinely skipped the race question. You then have to account for survey fatigue: people answer some questions at the beginning of the survey, then get bored, busy, or distracted and quit without answering the rest. Given that race was at the bottom of the seventh page of the survey, fatigue could be acute here. I couldn’t find anyone who skipped the race question but answered a question that came after it, so these three factors may fully account for all of the non-response.
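For concreteness, here is a minimal sketch of the bookkeeping described above, written in Python with pandas. The column names (took_full_survey, the race_* columns, answered_later_q) and the toy data are hypothetical stand-ins, not the actual survey’s schema:

```python
import pandas as pd

# Toy data with hypothetical column names -- the real survey schema differs.
# Each race column holds "Yes", "No", or NaN (box left blank).
race_cols = ["race_white", "race_black", "race_asian", "race_hispanic"]

df = pd.DataFrame({
    "took_full_survey":  [True,  True,  True,  False],
    "race_white":        ["Yes", None,  None,  None],
    "race_black":        [None,  None,  None,  None],
    "race_asian":        [None,  "Yes", None,  None],
    "race_hispanic":     [None,  None,  None,  None],
    "answered_later_q":  [True,  True,  False, False],
})

answered_any = df[race_cols].notna().any(axis=1)

# 1. Never asked: donations-only respondents saw no race questions at all.
never_asked = ~df["took_full_survey"]

# 2. Partial answers: "Yes" to at least one race, the other boxes left
#    blank, which shows up as missing data on those other questions.
partial = df["took_full_survey"] & answered_any & df[race_cols].isna().any(axis=1)

# 3. Genuine skips: saw the questions but answered none of them.
skipped = df["took_full_survey"] & ~answered_any

print(never_asked.sum(), partial.sum(), skipped.sum())  # -> 1 2 1

# Fatigue check: did anyone skip race but answer a later question?
# An empty result is consistent with fatigue explaining the genuine skips.
print(df[skipped & df["answered_later_q"]])
```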
That’s still a very important point that doesn’t seem to have been made in the analysis here: the demographic questions were not put to all respondents. There are good reasons to think that people taking the “full” and “donations-only” surveys will differ systematically (e.g. full-survey takers are likely to have been involved with EA for longer). If the non-responses are not random, that’s an important caveat on all these findings, and it very much limits any comparisons that can be done over time. I can’t see it discussed in the post?
Yeah. I personally think that offering the donations-only survey was a bad idea, for the reason you give and a few others.
Even if everyone had taken the full survey, the non-response would still be pretty non-random: you have to have the tenacity to persist to page seven, which I imagine correlates with being more involved in EA, and you have to have taken the survey in the first place, which we also know is not random. Still, it would have been nice not to make this worse.
Also, could respondents not say anything about being, e.g., Native American or Middle Eastern, or at least “Other”? I’m sure the structure of these questions has been discussed thoroughly in the social-science literature, and I don’t think the options shown here are in line with the standard style.
These were not options that we presented, but people could implicitly answer “Other” by not answering “Yes” to any of the race questions. We’d be happy to revisit this question if you think we should include additional races or an explicit “Other” option.