Are there any theories about what is driving the really high non-response rate for race? Or any cross-tabs showing which groups or locations are more likely to have a non-response for race? Racial demographics in EA are an important topic, and it’s a shame that we can’t get better data on it.
I can see how the non-response rate looks alarming, and I definitely owe some context for that.
One thing we tried this year was a separate donations-only survey, on which people reported their donations and answered only a few other questions. Race was not on this slimmer survey, so 554 people did not answer the race question simply because they were never asked it.
Another source of apparent non-response is that we asked a separate Yes/No question for each of four races (White, Black, Asian, and Hispanic). It looks like some people checked “Yes” for one race but did not explicitly check “No” for the others, so their blank boxes register as non-response even though they answered the question. This accounts for another 120 people.
Combining these first two reasons, only 67 people genuinely ignored the race question. You then have to account for survey fatigue: people answer questions at the beginning of the survey but get bored, busy, or distracted and quit without answering the rest. Given that race was at the bottom of the seventh page of the survey, this effect could be acute. I couldn’t find anyone who skipped the race question but answered a later question, so these three factors appear to fully account for the non-response.
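For concreteness, the accounting above can be tallied in a few lines. The total here is just the sum of the three figures given in the thread; no other counts are assumed:

```python
# Tallying the sources of apparent non-response on the race question,
# using only the three counts given above.
not_asked = 554         # took the donations-only survey, which omitted the race question
partial_checkbox = 120  # checked "Yes" for one race but left the other boxes blank
unexplained = 67        # left the race question entirely blank (likely survey fatigue)

total_apparent = not_asked + partial_checkbox + unexplained
print(total_apparent)                          # 741 apparent non-responses
print(f"{unexplained / total_apparent:.0%}")   # 9%: the share not explained by survey design
```

Under this breakdown, roughly nine-tenths of the apparent non-response is an artifact of how the surveys were structured rather than refusal to answer.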
That’s still a very important point that doesn’t seem to have been made in the analysis here: the demographic questions were not put to all respondents. There are good reasons to think that people taking the “full” and “donations-only” surveys will differ systematically (e.g. full-survey takers are more likely to have been involved with EA for longer). If the non-responses are not random, that’s an important caveat on all these findings and very much limits any comparisons that can be done over time. I can’t seem to see it discussed in the post?
Yeah. I personally think that offering the donations-only survey was a bad idea, for the reason you said and a few others.
Even if everyone took the full survey, the non-response would still be pretty non-random: you have to have the tenacity to persist to page seven, which I imagine correlates with being more involved in EA, and you have to have taken the survey in the first place, which we also know is not random. It would have been nice not to make this worse, though.
Also, could respondents not say anything about being e.g. Native American or Middle Eastern, or at least select “Other”? I’m sure the structure of these questions has been thoroughly discussed in the social-science literature, and I don’t think the options shown here are in line with the standard style.
These were not options that we presented, but people could implicitly answer “Other” by not answering “Yes” to any of the race questions. We’d be happy to revisit this question if you think we should include additional races or an explicit “Other” option.
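The coding rule implied above can be sketched as follows. The field names and function here are hypothetical illustrations, not the survey’s actual code: a respondent counts as “Other” only if they engaged with the question but said “Yes” to none of the four races, while a fully blank set of boxes is a true non-response:

```python
# Sketch of coding four Yes/No race checkboxes into categories.
# (Hypothetical names; "Other" is inferred, per the reply above.)
RACES = ["White", "Black", "Asian", "Hispanic"]

def code_races(answers):
    """answers maps race -> 'Yes', 'No', or is absent (box left blank)."""
    selected = [r for r in RACES if answers.get(r) == "Yes"]
    if selected:
        return selected   # one or more explicit "Yes" answers (multiracial allowed)
    if any(answers.get(r) is not None for r in RACES):
        return ["Other"]  # answered the question but said "Yes" to no listed race
    return None           # true non-response: every box left blank

code_races({"White": "Yes"})   # ['White']: blanks on the other boxes are not non-response
code_races({"White": "No", "Black": "No", "Asian": "No", "Hispanic": "No"})  # ['Other']
code_races({})                 # None
```

This also shows why the 120 partial-checkbox respondents should not be counted as non-responses: a single “Yes” is a complete answer under this rule.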
Sorry to fixate on this, but I’ve just never seen non-response rates this high before: 10% is high in most surveys, and 40% is absurd. Yes, you always have groups who feel the answer options don’t accurately capture their reality, but given that you did allow multiracial answers (and given the racial homogeneity of EA), that is usually only a very small fraction of respondents. There’s also the population who, for lack of a better term, “don’t believe in race” and never answer this question, but given how small this population is in general, unless an absurdly high number of them are EAs, this should also be only a very small fraction.
I really, really hope this isn’t the explanation, but I could see at least some of these answers coming from the perspective of “I don’t think race is a problem in EA, and people should stop asking about it, so I’ll just not answer at all as a protest or something.” As someone who sees data collection as sacred, I would be appalled by this, so please, someone, for the sake of my sanity, explain what could possibly drive a 40% non-response rate that is not this.
The answer looks to be pretty simple and unimportant, as I explain in this comment.
Oh thank goodness, I am glad it is nothing worrying. Thank you for clarifying!
I wondered too whether the oddly high proportion of refusals to answer was ideological. I hope this isn’t the case, and I’m inclined to think it’s unlikely; though there seem to be some EAs who are wary of questions about this kind of diversity, because they reject the idea that it is something to be tackled within the movement, I would not have thought that proportion (or rather the proportion who hold the view strongly enough to refuse to provide the data altogether) would be as large as this.
It feels rather optimistic to suggest this is an issue of categorisation. The race answers were not mutually exclusive (which would have been obviously problematic), though most race sections on surveys of this type that I’m used to do have more fine-grained response options. Even if this were an issue, however, it ought not to cause problems for such a large proportion of respondents.
It’s maybe worth noting the comparable proportion of respondents who did not answer on political leanings (42.7%). If nothing else, the number of people who refused to answer on race would be more bizarre/worrying if there were a very high response rate on everything else. My first thought was that refusal to answer on political leanings is likely ideological (wariness of EA becoming especially associated with any specific political position), but on second thought I wonder if this is more likely a category problem? (It may be that people are reluctant to select “other” because their political stance is nominally subsumed within left/right/centre etc., yet they feel it is not well described by these options...? However, I’m not confident how likely this is and do not have a background in the intricacies of data gathering, unlike, I assume, those at ReThink who put this together.)